2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII)
DOI: 10.1109/acii.2019.8925470
Eye-based Continuous Affect Prediction

Abstract: Eye-based information channels include the pupils, gaze, saccades, fixational movements, and numerous forms of eye opening and closure. Pupil size variation indicates cognitive load and emotion, while a person's gaze direction is said to be congruent with the motivation to approach or avoid stimuli. The eyelids are involved in facial expressions that can encode basic emotions. Additionally, eye-based cues can have implications for human annotators of emotions or feelings. Despite these facts, the use of eye-ba…

Cited by 8 publications (17 citation statements)
References 41 publications
“…To improve performance, they suggested including additional modalities and gaze features in future work. O'Dwyer et al. [50] explored the use of a larger gaze feature set to train an LSTM network for the task of continuous affect prediction on the RECOLA dataset [55]. They found that their model performed better for arousal prediction when trained on gaze features.…”
Section: Gaze-based Emotion Recognition
confidence: 99%
“…For example, gaze aversion was shown to impair the perception of anger and happiness [2,10], and embarrassment is connected to more downward gaze than amusement [34]. Despite the importance of gaze, relatively few works have studied emotion recognition based on gaze location and pupil size [5] or have combined gaze with other channels of affective information [3,50,62]. While the performance improvements demonstrated by these approaches underline the importance of integrating gaze into emotion recognition systems, they either rely on video information only [62] or assume that both video-based gaze features and speech input are available at training and test time [3,50].…”
Section: Introduction
confidence: 99%
“…Eye-tracking-based UIs include, for example, various assistive technology solutions for people with severe disabilities (e.g., [47]) who cannot use their arms or standard input devices. However, eye-based cues (e.g., eye gaze) are another field of increasing interest to the research community for automatic emotion classification and affect prediction [48].…”
Section: Background and Related Work
confidence: 99%