2006
DOI: 10.1007/11610113_70

A Color Bar Based Affective Annotation Method for Media Player

Cited by 6 publications (5 citation statements)
References 10 publications
“…For instance, in studies on affective communication in human-computer interaction (e.g., Tajadura-Jiménez and Västfjäll, 2008), the EmojiGrid could be used to enable users to repeatedly report perceived affect (e.g., by projecting a pointer-controlled beam on the grid). Such an application could also be useful for the affective annotation of multimedia (Chen et al., 2007; Soleymani et al., 2008; Runge et al., 2016; Suhaimi et al., 2018), for personalized affective multimedia retrieval (e.g., through query-by-emoji: Cappallo et al., 2019) or multimedia recommender systems (Hanjalic and Xu, 2005; Xu et al., 2008; Zhang et al., 2010; Lopatovska and Arapakis, 2011; Koelstra et al., 2012; Zhao et al., 2013), for the affective appraisal of multimedia entertainment in real-time (Fleureau et al., 2012), to give affective feedback in serious gaming applications (Anolli et al., 2010), and for affective multimedia generation (e.g., music: Kim and André, 2004). We are currently implementing the EmojiGrid in a multisensory VR environment (the Sensiks Sensory Reality Pod: www.sensiks.com) as an interface for the user to select and adjust the desired multisensory (visual, auditory, tactile, and olfactory) affective experiences.…”
Section: Discussion
confidence: 99%
“…While these existing tools require the user to successively explore the entire affective space to find the desired expression each time a response is given, the EmojiGrid provides an instantaneous overview of the affective input space. This feature may be useful for the affective annotation of multimedia (Runge et al., 2016) or personalized affective video retrieval (Xu et al., 2008; Lopatovska and Arapakis, 2011), for real-time affective evaluation of entertainment (Fleureau et al., 2012), or as an affective input tool for serious gaming applications (Anolli et al., 2010).…”
Section: Future Research
confidence: 99%
“…Since the EmojiGrid provides an instantaneous overview of the affective input space, it enables users to provide continuous affective feedback, for instance in HCI studies by moving a mouse-controlled cursor over the support of the grid. This feature may be useful for the real-time affective annotation of mulsemedia (Runge et al., 2016), for personalized affective video retrieval (Lopatovska and Arapakis, 2011; Xu et al., 2008), for real-time affective evaluation of entertainment (Fleureau et al., 2012), or as an affective input tool for serious gaming applications (Anolli et al., 2010). A simplified version of the EmojiGrid has already been implemented in the Sensiks Sensory Reality Pod (www.sensiks.com) to enable the user to select a multisensory (visual, auditory, tactile, and olfactory) experience with a prespecified affective quality.…”
Section: Conclusion and Discussion
confidence: 99%