Music induces different kinds of emotions in listeners. Previous research on music and emotions has shown that various music features can be used to classify the emotions that a piece of music induces in an individual. We propose a method for collecting electroencephalograph (EEG) data from subjects listening to emotion-inducing music. The EEG data is used to continuously label high-level music features with continuous-valued emotion annotations using the emotion spectrum analysis method. The music features are extracted from MIDI files using a windowing technique. We highlight the results of two emotion models, for stress and relaxation, which were constructed using C4.5. Evaluations of the models using 10-fold cross validation give promising results, with an average relative absolute error of 6.54% using a window length of 38.4 seconds.
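The windowing step can be illustrated with a minimal sketch. The event format, feature choice (mean pitch), and function name below are assumptions for illustration only; the paper's actual high-level MIDI features are not specified in this abstract.

```python
from typing import List, Tuple

def window_features(notes: List[Tuple[float, int]],
                    window_len: float = 38.4) -> List[float]:
    """Compute one feature (mean MIDI pitch) per fixed-length window.

    `notes` is a list of (onset_seconds, pitch) pairs -- a simplified,
    hypothetical stand-in for events parsed from a MIDI file.
    The 38.4 s default matches the window length reported above.
    """
    if not notes:
        return []
    end = max(t for t, _ in notes)
    n_windows = int(end // window_len) + 1
    feats = []
    for w in range(n_windows):
        lo, hi = w * window_len, (w + 1) * window_len
        pitches = [p for t, p in notes if lo <= t < hi]
        # 0.0 marks an empty (silent) window in this sketch
        feats.append(sum(pitches) / len(pitches) if pitches else 0.0)
    return feats
```

Each window's feature vector would then be paired with the EEG-derived emotion annotation for that time span to form one training instance for the C4.5 learner.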