2009 IEEE International Conference on Multimedia and Expo
DOI: 10.1109/icme.2009.5202572
Exploiting genre for music emotion classification

Abstract: Genre and emotion have been applied to content-based music retrieval and organization; however, the intrinsic correlation between them has not been explored. In this paper we present a statistical association analysis to examine such intrinsic correlation and propose a two-layer scheme that exploits the correlation for emotion classification. Significant improvement of classification accuracy over the traditional single-layer scheme is obtained.
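The two-layer scheme described in the abstract can be illustrated with a minimal sketch: a first-layer classifier predicts a track's genre from audio features, and each genre then routes to its own second-layer emotion classifier, so the genre-emotion correlation is exploited at prediction time. Everything below is illustrative (synthetic features, nearest-centroid classifiers, made-up label rules), not the authors' actual features or models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "audio features": 200 tracks, 4 features, 2 genres, binary emotion.
# The label rules are invented so the toy data exhibits a genre-emotion link.
X = rng.normal(size=(200, 4))
genre = (X[:, 0] > 0).astype(int)                # genre tied to feature 0
emotion = ((X[:, 1] + genre) > 0.5).astype(int)  # emotion depends on genre

def centroid_fit(X, y):
    """Return per-class centroids for a nearest-centroid classifier."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def centroid_predict(model, x):
    """Predict the class whose centroid is nearest to x."""
    return min(model, key=lambda c: np.linalg.norm(x - model[c]))

# Layer 1: genre model trained on all tracks.
genre_model = centroid_fit(X, genre)

# Layer 2: a separate emotion model per genre.
emotion_models = {g: centroid_fit(X[genre == g], emotion[genre == g])
                  for g in (0, 1)}

def predict_emotion(x):
    """Route a feature vector through both layers: genre first, then emotion."""
    g = centroid_predict(genre_model, x)
    return centroid_predict(emotion_models[g], x)

acc = np.mean([predict_emotion(x) == e for x, e in zip(X, emotion)])
```

Because the second-layer models are trained only on tracks of their genre, each can learn a genre-specific decision boundary, which is the intuition behind the reported improvement over a single flat classifier.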

Cited by 13 publications (12 citation statements)
References 5 publications
“…Nevertheless, knowing the genre of a song can be helpful for mood classification as shown e.g. by or Lin, Yang, Chen, Liao, and Ho (2009), and therefore genre information should be added to the feature set.…”
Section: Genre (mentioning)
confidence: 99%
“…Lee et al proposed a convolutional attention networks model to learn the features of both speech and text data, and the proposed model obtained better results for classifying emotions in the benchmark datasets [37]. Some studies considered to use other mid-level or high-level audio features, such as chord progression and genre metadata in [27,[38][39][40]. Lin et al presented the association between genre and emotion, and proposed a two-layer scheme that could exploit the correlation for emotion classification [38].…”
Section: Introduction (mentioning)
confidence: 99%
“…Some studies considered to use other mid-level or high-level audio features, such as chord progression and genre metadata in [27,[38][39][40]. Lin et al presented the association between genre and emotion, and proposed a two-layer scheme that could exploit the correlation for emotion classification [38]. Schuller et al [40] incorporated genre, ballroom dance style, chord progression and lyrics to classify music emotion, and the experiments showed that most of the considered factors would improve the classification accuracy.…”
Section: Introduction (mentioning)
confidence: 99%
“…Training machines to recognize emotions from music resembles training machines to recognize genres used by music: the significant amount of subjectivity [5]; the debates about whether genre is even in the music signal [6]; the assumed existence of abstract categories that are difficult if not impossible to systematize, to define directly with a set of clear unambiguous rules, or even to define indirectly by exemplars that are indisputably representative [7]; ground truths that are difficult if not impossible to generate for datasets that are deceptively simple to assemble; and the difficulty of evaluation [8]. Genre and mood have been shown to be correlated in some cases [1,9]; and some systems proposed for MER are just adapted from MGR [1,2]. Thus, since MGR is one of the most studied areas in music information research [2,10] -MGR has been called its "flagship application" [11] -we argue that the challenges MGR research faces in evaluation inform those faced in MER.…”
Section: Introduction (mentioning)
confidence: 99%