2016
DOI: 10.24846/v25i1y201601
An Affect-Based Multimodal Video Recommendation System

Abstract: People watching a video can almost always suppress their speech, but they cannot suppress their body language or control their physiological and behavioral parameters. Affects/emotions, sensory processing, actions/motor behavior, and motivation are linked to the limbic system, which is responsible for instinctive and instantaneous human reactions to the environment or to other people. Limbic reactions are immediate, reliable, time-tested, and common to all people. Such reactions are highly spontaneous and reflect the video viewe…

Cited by 8 publications (2 citation statements)
References 8 publications
“…Facial expressions/features: Soleymani and Pantic, 2012; Zhao et al., 2013; Boughrara et al., 2016; Choi et al., 2016; Kaklauskas et al., 2016; Mahata et al., 2017; Diaz et al., 2018; Fonnegra, 2018; Hewitt and Gunes, 2018; Kaklauskas et al., 2018; Bohlin et al., 2019; Soni et al., 2019; De Pessemier et al., 2020; Mishra et al., 2020; Leite et al., 2022.
Skin-estimated pulse/heart rate: Dabas et al., 2018; Diaz et al., 2018; Shu et al., 2018a; Bohlin et al., 2019; Soni et al., 2019; Ðorđević Čegar et al., 2020.
Mood: Winoto and Tang, 2010.
EDA: Ðorđević Čegar et al., 2020.
BA: Alhagry, 2017; Liu et al., 2017; Kwon et al., 2018; Ogawa et al., 2018; Yang et al., 2019; Ðorđević Čegar et al., 2020.
User interactions: Niu et al., 2013; Niu et al., 2016.
GSR: Kwon et al., 2018.
Body gestures: Hassib et al., 2017.
Perceived connotative properties: Martha and Larson, 2013; Zhang and Zhang, 2017.
Movie reviews/comments/web recordings: Mulholland et al., 2017; Yenter, 2017; Tripathi et al., 2019; Krishnamurthy, 2020; Pan et al., 2020; Breitfuss et al…”
Section: Audience Responses Publicationsmentioning
confidence: 99%
“…These automated systems can recognize and interpret a person's emotional state through physical and physiological measures. Physical measures capture communicative signals such as facial expressions [46,170,171], speech [47,172], body gestures [47,173], and eye tracking while viewing interactive content [32,174], whereas physiological measures record bodily variations such as changes in temperature and increases in blood pressure [1][2][3].…”
Section: Emotion Recognitionmentioning
confidence: 99%