2012
DOI: 10.1109/t-affc.2011.37
Multimodal Emotion Recognition in Response to Videos

Abstract: This paper presents a user-independent emotion recognition method with the goal of recovering affective tags for videos using electroencephalogram (EEG), pupillary response and gaze distance. We first selected 20 video clips with extrinsic emotional content from movies and online resources. Then, EEG responses and eye gaze data were recorded from 24 participants while watching emotional video clips. Ground truth was defined based on the median arousal and valence scores given to clips in a preliminary…
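As a rough illustration of the ground-truth scheme the abstract describes (binary labels derived from the median arousal and valence scores given to clips), here is a minimal Python sketch; the ratings and the `median_split` helper are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Hypothetical self-reported ratings (1-9 scale) for six video clips.
# The paper defines ground truth from median arousal/valence scores
# collected in a preliminary study; this sketch only mirrors that idea.
arousal = np.array([6.1, 3.2, 7.4, 4.8, 5.5, 2.9])
valence = np.array([7.0, 2.5, 6.3, 4.1, 5.8, 3.4])

def median_split(scores):
    """Binarize ratings into low (0) / high (1) classes around the median."""
    return (scores > np.median(scores)).astype(int)

print(median_split(arousal))  # [1 0 1 0 1 0]
print(median_split(valence))  # [1 0 1 0 1 0]
```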

Cited by 583 publications (313 citation statements)
References 46 publications
“…[104] Employing images, sounds, and videos for emotion elicitation is also motivated by affective tagging applications, which consist of automatically assigning tags to multimedia content. [105,106] In a psychophysiological study of emotion induced by music and film stimuli, Stephens et al [107] replicated the finding of autonomic specificity for basic emotions and demonstrated that autonomic nervous system (ANS) specificity of emotion was not a function of the emotion induction technique. In [108], the authors showed that the emotion assessment performance obtained using visual and auditory stimuli is similar.…”
Section: Emotion Elicitation
confidence: 97%
“…[113] Focusing on valence and arousal, Soleymani et al [105] argue that the arousal dimension is better discriminated by brain activity than the valence dimension. Looking at the studies that analyzed two-class classification of both valence and arousal [105,106,110,114,115], the arousal accuracy is only marginally higher than the valence accuracy (valence mean accuracy is 65.6%, arousal mean accuracy is 68.2%), so it is difficult to conclude any real advantage of neurophysiological signals for arousal assessment. Unfortunately, it is difficult to compare valence-arousal results with those obtained with basic emotions because of the difference in the number of classes employed.…”
Section: Assessed Emotions
confidence: 99%
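The comparison in this excerpt is just a mean over per-study two-class accuracies; the sketch below makes the arithmetic explicit. The individual accuracies are placeholders (the excerpt only quotes the means), chosen so the output matches the quoted 65.6% and 68.2%.

```python
# Placeholder per-study two-class accuracies; only the means (65.6% valence,
# 68.2% arousal) appear in the excerpt above, so these values are invented.
valence_acc = [0.62, 0.64, 0.66, 0.68, 0.68]
arousal_acc = [0.65, 0.67, 0.68, 0.70, 0.71]

mean = lambda xs: sum(xs) / len(xs)
print(f"valence mean accuracy: {mean(valence_acc):.1%}")  # 65.6%
print(f"arousal mean accuracy: {mean(arousal_acc):.1%}")  # 68.2%
```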
“…Moreover, in comparison with still pictures and sounds, film clips can elicit more intense emotional responses, leading to activations in cognitive, experiential, central physiological, peripheral physiological, and behavioral systems (Rottenberg et al 2007). In line with this argument, some recent studies in affective computing, for instance (Soleymani et al 2011), relied on film clips to induce emotional states. Film clips were ultimately chosen as the method of eliciting archetypal experiences in our participants because they have proved effective at inducing emotions and it was not clear which type of media would work best for archetypes.…”
Section: Stimuli
confidence: 86%
“…Participants watched a subset of images from the International Affective Picture System (IAPS) [35] and rated, on a five-point Likert scale, their preference for using each picture as their desktop wallpaper. The LDOS-PerAff-1 database is available online.…”
Section: B. Databases
confidence: 99%
“…The users' behavior and spontaneous reactions to multimedia data can provide useful information for multimedia indexing in the following scenarios: (i) direct assessment of tags: users' spontaneous reactions are translated into emotional keywords, e.g., funny, disgusting, scary [3], [4], [5], [6]; (ii) assessing the correctness of explicit tags or topic relevance, e.g., agreement or disagreement with a displayed tag or the relevance of a retrieved result [7], [8], [9], [10]; (iii) user profiling: a user's personal preferences can be detected from her reactions to retrieved data and used to re-rank the results; (iv) content summarization: highlight detection is also possible using implicit feedback from users [11], [12].…”
Section: Introduction
confidence: 99%
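As a sketch of scenario (i) above, translating a decoded reaction into an emotional keyword: the thresholds, the (valence, arousal) input convention, and the tag vocabulary are invented for illustration, and the actual decoding from EEG or pupillary signals is outside this sketch.

```python
def reaction_to_tag(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) estimate in [-1, 1] to an emotional keyword,
    as in the 'direct assessment of tags' scenario."""
    if valence > 0.5 and arousal > 0.5:
        return "funny"
    if valence < -0.5 and arousal > 0.5:
        return "scary"
    if valence < -0.5:
        return "disgusting"
    return "neutral"

print(reaction_to_tag(valence=0.8, arousal=0.7))    # funny
print(reaction_to_tag(valence=-0.9, arousal=0.8))   # scary
```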