Physiological-Based Affect Event Detector for Entertainment Video Applications
2012 | DOI: 10.1109/t-affc.2012.2
Cited by 64 publications (34 citation statements) | References 16 publications
“…Indeed, the related works differ in the modality used to recognize affective states, which can be natural or induced. Emotion can be evoked by watching affective movies [20] or video clips [21], by playing a video game [22], by driving a car, or by listening to music [23], [24]. Moreover, emotion can be described with different models: the first is Ekman's model, which is based on universal emotional expressions and defines six discrete basic emotions: happiness, sadness, surprise, fear, anger, and disgust [25].…”
Section: Introduction
confidence: 99%
“…Experimental results on a dataset of 64 scenes from eight movies watched by eight participants demonstrated that, in addition to video features, subjects' physiological responses (i.e., GSR, EMG, blood pressure, respiration, and ST) could provide affective rankings of video scenes. Fleureau et al. [97] proposed a two-stage affect detector for video viewing and entertainment applications. They first recognized affective events in the videos and then used Gaussian processes to classify the video segments as positive or negative using GSR, heart rate, and electromyogram.…”
Section: Implicit Video Affective Content Analysis Using Physiological…
confidence: 99%
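A minimal sketch of what such a second-stage classifier could look like, assuming synthetic per-segment features in place of real GSR, heart rate, and EMG measurements; the feature layout, labels, and RBF kernel below are placeholder assumptions, not the authors' pipeline:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)
    # Hypothetical per-segment feature vectors: [mean GSR, mean heart rate, EMG energy]
    X = rng.normal(size=(80, 3))
    # Synthetic positive/negative labels standing in for annotated affective events
    y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)

    # Gaussian process classifier (kernel choice assumed) separating
    # positive from negative segments
    gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
    gp.fit(X, y)
    print(gp.predict_proba(X[:5]))  # class probabilities for the first five segments

A Gaussian process classifier returns calibrated class probabilities rather than hard labels, which is one reason it suits ranking segments by affective polarity.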
“…In addition, they used different methods for emotion dimension prediction, such as the linear relevance vector machine [91], and for emotion category classification, such as decision trees [13], Gaussian process classifiers [97], and SVMs [96], [106].…”
Section: Implicit Video Affective Content Analysis Using Physiological…
confidence: 99%
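As a rough illustration of how two of the cited classifier families compare on the same feature set, the sketch below cross-validates an SVM and a decision tree on synthetic physiological feature vectors; all data, dimensions, and hyperparameters are placeholders, not values from the cited studies:

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 6))     # placeholder physiological feature vectors
    y = rng.integers(0, 4, size=120)  # four synthetic emotion categories

    models = {
        "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
        "DecisionTree": DecisionTreeClassifier(max_depth=5, random_state=1),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validated accuracy
        print(name, round(scores.mean(), 3))

Scaling matters for the SVM (hence the StandardScaler in its pipeline) but not for the tree, which is why the two models are wrapped differently.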
“…Chêne et al. [27] used physiological linkage between differ… Petridis and Pantic proposed a method for tagging videos with their level of hilarity by analyzing the user's laughter [5]. Different types of laughter can indicate the level of hilarity of multimedia content.…”
Section: State of the Art
confidence: 99%