2014 IEEE International Conference on Multimedia and Expo (ICME)
DOI: 10.1109/icme.2014.6890301

Continuous emotion detection using EEG signals and facial expressions

Abstract: Emotions play an important role in how we select and consume multimedia. Recent advances in affect detection have focused on detecting emotions continuously. In this paper, for the first time, we continuously detect valence from electroencephalogram (EEG) signals and facial expressions in response to videos. Multiple annotators provided continuous valence levels while watching the frontal facial videos of participants who watched short emotional videos. Power spectral features from EEG signals as well as facial …
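
The abstract mentions power spectral features extracted from EEG; as a rough illustration of how such features are commonly computed, here is a minimal sketch. The sampling rate, band edges and Welch parameters below are assumptions for illustration, not the authors' exact pipeline.

```python
# A minimal sketch of EEG band-power feature extraction: Welch PSD per
# channel, averaged within each band and log-transformed. Sampling rate,
# band edges and window length are illustrative assumptions.
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 12.0), "beta": (13.0, 30.0)}

def band_power_features(eeg, fs=256.0):
    """eeg: (channels, samples) array -> one log power per (channel, band)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))  # 2 s windows
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=1)))  # mean PSD in band
    return np.concatenate(feats)

# Example: 32 channels, 10 s of simulated signal at 256 Hz
print(band_power_features(np.random.randn(32, 2560)).shape)  # (96,)
```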

Cited by 82 publications (54 citation statements)
References 21 publications
“…Muscle artifacts can affect the patterns of EEG signals. Soleymani et al. [60] attributed the correlation between the EEG features and continuous valence in their study to a combination of effects from facial expressions and brain activity. However, we think that the topographs in Fig.…”
Section: Delta (mentioning)
confidence: 98%
“…Since physiological reactions are considered an important component of emotions [15], [4], their measurement provides insight into spectators' aesthetic experience elicited by particular scenes [24]. In the field of affective computing, researchers have attempted to investigate emotion recognition in response to multimedia content using electroencephalography (EEG) signals, peripheral physiological signals and facial expressions [14], [22]. The combination of spectators' physiological signals has been proposed in [5].…”
Section: Introduction (mentioning)
confidence: 99%
“…[9] with the discrete emotional keywords [10]. Many works with promising results have been proposed on the EEG-based affective classification problem [1], [2], [11]-[13]; however, the majority of current studies have focused on spectral power within a set of broad bands, Theta (4-8 Hz), Alpha (8-12 Hz) and Beta (13-30 Hz), or on the spectral power ratio between symmetrical pairs of electrodes, also referred to as differential asymmetry features. Although good results have been reported for such feature extraction approaches, the use of differential asymmetry features is limiting [3], particularly in non-medical applications such as affective human-computer interfaces, where the limitation on the number of electrodes and their spatial locations is pronounced. The assumption that electrodes will be present in pairs might not be realistic or practical; therefore, in this paper we attempt to explore narrow-band spectral power variations under various affective states for discriminating features, with the hope of eliminating the reliance on electrode pairing.…”
Section: A. EEG and the Affective States (mentioning)
confidence: 98%
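
The asymmetry features this excerpt describes are differences in band power between symmetrical left/right electrode pairs; a minimal sketch follows, assuming hypothetical 10-20 channel names and band powers like those from the earlier snippet (none of this is taken from the cited papers' code).

```python
# A minimal sketch of differential asymmetry features: the difference in
# band power between symmetrical left/right electrode pairs. The channel
# pairs below are illustrative assumptions from a 10-20 layout.
import numpy as np

SYMMETRIC_PAIRS = [("F3", "F4"), ("C3", "C4"), ("P3", "P4")]

def asymmetry_features(band_power, channel_index):
    """band_power: dict mapping band name -> per-channel power vector."""
    feats = []
    for power in band_power.values():
        for left, right in SYMMETRIC_PAIRS:
            feats.append(power[channel_index[left]] -
                         power[channel_index[right]])
    return np.array(feats)

# Example with random powers for a hypothetical 6-channel montage
idx = {name: i for i, name in
       enumerate(["F3", "F4", "C3", "C4", "P3", "P4"])}
bp = {"alpha": np.random.rand(6), "beta": np.random.rand(6)}
print(asymmetry_features(bp, idx))  # 2 bands x 3 pairs = 6 features
```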
“…However, since the EEG signal originates from very complex brain circuitry with few known parameters, it is also a very challenging task to utilize EEG signals for such pattern classification problems. Although many promising EEG-based affective signal processing systems have been proposed in recent years [1]-[3], few cross-subject classification studies [4] have been reported. The most common practice is the single-subject, leave-one-response-out experimental setup [5], [6].…”
Section: Introduction (mentioning)
confidence: 99%
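
This excerpt contrasts the common single-subject, leave-one-response-out setup with cross-subject evaluation; a minimal sketch of the cross-subject variant using scikit-learn's LeaveOneGroupOut follows, where the data, labels and subject assignments are placeholders rather than any published dataset.

```python
# A minimal sketch of cross-subject evaluation: each fold holds out all
# trials of one subject, so the classifier never sees the test subject.
# The data, labels and subject assignments below are placeholders.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

X = np.random.randn(120, 96)             # 120 trials x 96 EEG features
y = np.random.randint(0, 2, 120)         # binary valence labels
subjects = np.repeat(np.arange(10), 12)  # 10 subjects, 12 trials each

scores = cross_val_score(SVC(kernel="linear"), X, y,
                         groups=subjects, cv=LeaveOneGroupOut())
print(f"mean cross-subject accuracy: {scores.mean():.3f}")
```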