2017
DOI: 10.1007/s00530-017-0542-0

Affective content analysis of music emotion through EEG

Cited by 37 publications (15 citation statements) · References 36 publications
“…Therefore, studies have used biophysical signals or metrics (such as electromyography and skin conductance [43]) to recognize user emotions. Some studies exploit technologies from cognitive neuroscience to represent user emotion states through electroencephalography (EEG) [44,45] in music-related activities. Ayata et al. even built an emotion-based music recommendation system by tracking user emotion states with wearable physiological sensors [17].…”
Section: Related Work
confidence: 99%
“…There was also a musical condition, in which positive mood was elicited by La Primavera (Spring) from The Four Seasons by Vivaldi [21]. Elicitation of emotion by music is also very common [28]; it reveals that mood influences attentional networks [23] and evokes genuine basic emotions such as happiness or sadness [10].…”
Section: Mood Induction and Measures
confidence: 99%
“…Existing studies share similar frameworks for emotion analysis based on EEG-multimodal data fusion. Affective images, music [50], and videos [51]–[57] have been selected as the stimuli to induce emotions. Users' emotion states are measured with physiological and EEG sensors, and the corresponding emotion annotations are rated by the users during the experiment.…”
Section: Emotion Recognition Based on EEG-Multimodal Data Fusion
confidence: 99%
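As a rough illustration of the feature-level fusion these frameworks share, the sketch below concatenates EEG features with peripheral physiological features and trains a classifier against user-rated emotion labels. This is a minimal sketch on synthetic data; the array shapes, feature choices, and the SVM classifier are assumptions, not the cited studies' exact pipelines.

# Minimal sketch (assumed, not from the cited papers): feature-level fusion
# of EEG and peripheral physiological features for emotion classification.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 200

# Assumed per-trial features: band powers per EEG channel plus
# skin-conductance / heart-rate statistics from peripheral sensors.
eeg_features = rng.normal(size=(n_trials, 32 * 5))  # 32 channels x 5 bands
phys_features = rng.normal(size=(n_trials, 8))      # peripheral signals
labels = rng.integers(0, 2, size=n_trials)          # e.g. low/high arousal

# Feature-level fusion: concatenate the modalities before classification.
fused = np.hstack([eeg_features, phys_features])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

Concatenation is the simplest fusion strategy; decision-level fusion (training one model per modality and combining their outputs) is a common alternative in this literature.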
“…In a recent study by Jia et al. in 2018, music features and EEG features were combined into a fused dataset to build a personalized model for music emotion recognition, which achieved an average MSE of 0.176 [50]. L. Granados et al. applied a deep learning approach to an EEG and galvanic skin response dataset and achieved a classification accuracy of 71% for arousal and 75% for valence [59].…”
Section: Emotion Recognition Based on EEG-Multimodal Data Fusion
confidence: 99%
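The fusion-for-regression idea attributed to Jia et al. can be sketched as follows: concatenate music (audio) features with EEG features and fit a regressor to continuous emotion ratings, evaluating by MSE. Everything below runs on synthetic data; the random-forest model and all feature dimensions are illustrative assumptions rather than the paper's actual method.

# Minimal sketch (assumed): music + EEG feature fusion for continuous
# emotion-rating regression, evaluated by mean squared error.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_clips = 300

music_features = rng.normal(size=(n_clips, 20))  # e.g. MFCC/tempo statistics
eeg_features = rng.normal(size=(n_clips, 40))    # e.g. per-band log power
valence = rng.uniform(0.0, 1.0, size=n_clips)    # continuous rating in [0, 1]

X = np.hstack([music_features, eeg_features])    # feature-level fusion
X_train, X_test, y_train, y_test = train_test_split(
    X, valence, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out MSE: {mean_squared_error(y_test, model.predict(X_test)):.3f}")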