2020
DOI: 10.1155/2020/2909267
Eye-Tracking Analysis for Emotion Recognition

Abstract: This article reports the results of a study on emotion recognition using eye-tracking. Emotions were evoked by presenting dynamic movie material in the form of 21 video fragments. Eye-tracking signals recorded from 30 participants were used to calculate 18 features associated with eye movements (fixations and saccades) and pupil diameter. To ensure that the features were related to emotions, we investigated the influence of the luminance and the dynamics of the presented movies. Three classes of emo…
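The abstract describes extracting fixation-, saccade-, and pupil-based features from eye-tracking recordings and classifying three emotion classes from them. The sketch below illustrates that kind of pipeline in Python; the specific features, the extract_features helper, and the SVM classifier are illustrative assumptions, not the authors' exact 18-feature method.

# Hypothetical sketch: eye-tracking features -> three-class emotion classifier.
# The feature set and classifier choice are illustrative assumptions, not the
# exact pipeline of the paper.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def extract_features(fixation_durations, saccade_amplitudes, pupil_diameter):
    """Summarise one trial (one participant watching one video fragment)."""
    return np.array([
        len(fixation_durations),        # number of fixations
        np.mean(fixation_durations),    # mean fixation duration
        len(saccade_amplitudes),        # number of saccades
        np.mean(saccade_amplitudes),    # mean saccade amplitude
        np.mean(pupil_diameter),        # mean pupil diameter
        np.std(pupil_diameter),         # pupil diameter variability
    ])

def evaluate(X, y):
    """Cross-validated accuracy for a feature matrix X and emotion labels y (0/1/2)."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X, y, cv=5).mean()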

Cited by 49 publications (37 citation statements) | References: 47 publications
“…However, the highest accuracies obtained lie below 58% for [184] and 54% for [185], respectively. In [186], classification of emotional arousal and valence is attempted by exploiting various eye features related to fixations, saccades, and pupil size, achieving a relatively high accuracy of 80.00% on a three-class classification problem. The other two studies concern the binary depressed/non-depressed classification problem.…”
Section: A. Emotional Arousal Recognition (citation type: mentioning)
Confidence: 99%
“…Prior work indicated that PD changes can be used as an indicator of arousal states [59], but are also largely affected by lighting conditions [60]. Recently, Pfleging et al. [61] and Tarnowski et al. [62] modelled PD as the sum of two contributing factors: (1) PD given the lighting conditions, and (2) PD given the experience of the task. In our study, since the 360° videos were played around and near the eyes, there was no light source other than the presentation of the 360° videos.…”
Section: HM and EM Data Analysis (citation type: mentioning)
Confidence: 99%
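The statement above describes pupil diameter (PD) as the sum of a luminance-driven component and a task- or experience-driven component. A compact way to write this decomposition, using an assumed notation that does not appear verbatim in either cited paper, is:

\[
PD_{p,v}(t) \;\approx\; PD^{\mathrm{lum}}_{p,v}(t) + PD^{\mathrm{task}}_{p,v}(t),
\]

where the first term is the component explained by the luminance of video v for participant p, and the second is the residual attributed to the participant's emotional or cognitive response to the content.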
“…PD_{p,average} is the average PD of both eyes recorded from participant p, while PD_{v} is the PD given the luminance condition of video v. Following Tarnowski et al.'s work [62], we used a linear regression (coefficients k, b) to model the relationship between PD and the luminance of video v for participant p:…”
Section: PD Data Analysis (citation type: mentioning)
Confidence: 99%
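The regression equation itself is truncated in the snippet above. As a rough, hypothetical illustration of the kind of per-participant linear luminance model being described (slope k, intercept b), one might fit and apply it as follows; the function names and data layout are assumptions, not code from the cited work.

# Hypothetical sketch of the per-participant luminance regression described
# above: average PD modelled as a linear function of video luminance,
# PD ≈ k * luminance + b. Names and data layout are illustrative assumptions.
import numpy as np

def fit_luminance_model(video_luminance, avg_pd):
    """Fit slope k and intercept b for one participant from per-video data."""
    k, b = np.polyfit(video_luminance, avg_pd, deg=1)
    return k, b

def luminance_corrected_pd(pd_samples, luminance, k, b):
    """Subtract the luminance-driven component, leaving the task-related part."""
    return pd_samples - (k * luminance + b)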
“…Liu, and D. Zhang (2018) and S. Koelstra and I. Patras (2013) based on the results of EEG signals and facial expressions. Tarnowski et al. (2020) report obtaining 80% accuracy when recognizing three classes of emotions (high arousal and high valence; high arousal and low valence; low arousal and moderate valence) with eye-tracking tools. As a result, we may conclude that such techniques can be used beneficially in practice.…”
Section: Context Formation and Consumer Education Before Visits (citation type: mentioning)
Confidence: 99%