2012
DOI: 10.1109/t-affc.2011.39
Toward E-Motion-Based Music Retrieval: A Study of Affective Gesture Recognition

Cited by 17 publications (12 citation statements)
References 29 publications
“…The results showed that subjects are able to consistently relate the musical excerpts with the reaction forces, even if the FEI metaphor seems to be more suitable for describing expressive cues in simple musical excerpts, where the expressive content is mainly related to performance cues, than in complex musical stimuli, where musical structure is more relevant. Amelynck et al. (2012) asked the subjects to express the affective character of a set of musical fragments by arm movements while listening to the fragments. The subjects' arm motion was captured using a wireless handheld device equipped with 3D inertial sensors.…”
Section: Music and Action
Citation type: mentioning (confidence: 99%)
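As a rough illustration of the kind of data involved (a hypothetical sketch only; the device API, sampling rate, and the cited study's actual feature extraction are not shown here), 3D inertial samples from such a handheld device might be reduced to a per-sample motion magnitude like this:

```python
import numpy as np

def acceleration_magnitude(samples: np.ndarray) -> np.ndarray:
    """Euclidean norm of each 3D accelerometer sample.

    `samples` is an (N, 3) array of x/y/z readings, standing in for
    a stream from a handheld device's inertial sensors.
    """
    return np.linalg.norm(samples, axis=1)

# Hypothetical data: 100 simulated 3-axis samples.
rng = np.random.default_rng(0)
arm_motion = rng.normal(0.0, 1.0, size=(100, 3))
print(acceleration_magnitude(arm_motion)[:5])
```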
“…Furthermore, we assume that rhythmicity is more related to arousal, whereas expressive properties are more related to valence, which has been shown to be more difficult to predict from movement (cf. Amelynck et al. [2012] or Irrgang and Egermann [2016]). Another restriction on predicting complex and tonal properties stems from the fact that we did not model changes over time in the accelerometer motion data and music features, since both were averaged over time.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
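The averaging limitation mentioned in this excerpt can be made concrete with a small sketch. The feature names and NumPy-based pipeline below are assumptions for illustration, not the authors' actual feature set:

```python
import numpy as np

def summary_features(accel: np.ndarray, fs: float) -> dict:
    """Time-averaged motion descriptors from (N, 3) accelerometer data.

    Averaging over the whole excerpt, as described above, keeps only
    the global activity level and discards temporal structure such as
    rhythmic patterning. Feature names here are illustrative.
    """
    mag = np.linalg.norm(accel, axis=1)   # per-sample magnitude
    jerk = np.diff(mag) * fs              # finite-difference derivative
    return {
        "mean_magnitude": float(mag.mean()),  # overall activity
        "std_magnitude": float(mag.std()),    # variability
        "jerk_rms": float(np.sqrt(np.mean(jerk ** 2))),
    }

# Hypothetical usage with simulated data sampled at 50 Hz.
rng = np.random.default_rng(1)
print(summary_features(rng.normal(size=(500, 3)), fs=50.0))
```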
“…In the beginning, participants chose one of their own songs to warm up and get familiar with the study's procedure. They were instructed as follows: “Move the smartphone to the music. You can move the rest of the body intuitively along with it but keep in mind that the characteristic motion must be captured by the phone.…”
(Footnotes: 1. www.optitrack.com; 2. www.airdroid.com)
Section: Main Study
Citation type: mentioning (confidence: 99%)
“…A gesture-controlled MP3 player [7] was developed using image processing in MATLAB and an edge detection algorithm. E-Motion-based music retrieval [1] presented a foundation for a motion-based user interface for music retrieval using a wireless handheld device equipped with inertial sensors [9]. A system was developed for controlling a music player [2], using a webcam to capture hand gestures, identify them, and finally access the music player.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
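As a hedged sketch of the webcam-plus-edge-detection front end described for the systems in [2] and [7] (the camera index, Canny thresholds, and loop structure are assumptions, and the gesture classifier itself is omitted), frames could be reduced to edge maps with OpenCV like this:

```python
import cv2

# Hypothetical front end: read webcam frames and compute edge maps
# that a downstream gesture classifier could consume.
cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)  # assumed thresholds
    cv2.imshow("edges", edges)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```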