It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of many time- and frequency-varying components, and music-induced emotions are known to differ greatly between listeners. It is therefore not immediately clear what emotions a given piece of music will induce in a given individual. We attempt to predict the music-induced emotional response of a listener by measuring activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to treat music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found that allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234 (p < 0.001). This fit corresponds to roughly 5% of the variance (r² ≈ 0.055) in the participants' music-induced emotions being predictable from their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Moreover, combining measures of brain activity with acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracy than either feature type alone (p < 0.01).
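As an illustration of this kind of regression fit, the following sketch (not the authors' code; the feature names, matrix shapes, and choice of a ridge estimator are all assumptions) combines placeholder EEG and acoustic feature matrices in a cross-validated linear regression and reports the Pearson correlation between actual and predicted ratings, together with the corresponding r² value.

```python
# Illustrative sketch, not the study's pipeline: predict a continuous emotion
# rating from concatenated EEG band-power and acoustic features, then report
# the correlation between actual and predicted responses.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_trials = 200
eeg_feats = rng.normal(size=(n_trials, 32))    # e.g. band power per channel (placeholder)
audio_feats = rng.normal(size=(n_trials, 10))  # e.g. tempo, RMS, spectral centroid (placeholder)
ratings = rng.normal(size=n_trials)            # reported valence/arousal (placeholder)

X = np.hstack([eeg_feats, audio_feats])        # combined EEG + acoustic feature set
pred = cross_val_predict(Ridge(alpha=1.0), X, ratings, cv=10)

r, p = pearsonr(ratings, pred)
print(f"r = {r:.3f}, p = {p:.3g}, variance explained r^2 = {r**2:.3f}")
# Note: r = 0.234 corresponds to r^2 ≈ 0.055, i.e. about 5% of the variance.
```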
The neural mechanisms of music listening and appreciation are not yet completely understood. Based on the apparent relationship between the tempo (beats per minute) of music and the desire to move (for example, foot tapping) it induces while listening, it is hypothesised that musical tempo may evoke movement-related activity in the brain. Participants are instructed to listen, without moving, to a large set of musical pieces spanning a range of styles and tempos during an electroencephalogram (EEG) experiment. Event-related desynchronisation (ERD) in the EEG is observed to correlate significantly with the variance of the tempo of the musical stimuli. This suggests that the dynamics of the beat of the music may induce movement-related brain activity in the motor cortex. Furthermore, significant correlations are observed between EEG activity in the alpha band over the motor cortex and the bandpower of the music in the same frequency band over time. This relationship in turn correlates with the strength of the ERD, suggesting that entrainment of motor cortical activity to the music relates to increased ERD strength.
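A minimal sketch of the kind of analysis described here, assuming an alpha band of 8–12 Hz, a Hilbert-envelope bandpower estimate, a 5-second baseline for the ERD, and placeholder signals in place of real EEG and audio recordings (none of these choices are taken from the study itself):

```python
# Illustrative sketch under stated assumptions: correlate the alpha-band
# power envelope of a motor-cortex EEG channel with the bandpower of the
# music in the same frequency band over time.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_envelope(x, fs, lo=8.0, hi=12.0):
    """Band-pass filter a signal and return its amplitude envelope."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, x)))

fs = 256.0                                  # assumed EEG sampling rate
t = np.arange(0, 60, 1 / fs)
eeg_c3 = np.random.randn(t.size)            # placeholder for a C3 (motor cortex) channel
music = np.random.randn(t.size)             # placeholder audio, resampled to fs

eeg_alpha = band_envelope(eeg_c3, fs)
music_alpha = band_envelope(music, fs)

# ERD is conventionally expressed as percentage power change from baseline:
baseline = eeg_alpha[: int(5 * fs)].mean()  # first 5 s as baseline (assumed)
erd = 100.0 * (eeg_alpha - baseline) / baseline
print(f"mean ERD relative to baseline: {erd.mean():.1f}%")

r = np.corrcoef(eeg_alpha, music_alpha)[0, 1]
print(f"alpha-band EEG/music envelope correlation: r = {r:.3f}")
```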
Affective algorithmic composition is a growing field that combines perceptually motivated affective computing strategies with novel music generation. This paper presents work towards the development of one such application. The long-term goal is to develop a responsive and adaptive system for inducing affect that is both controlled and validated by biophysical measures. Literature documenting perceptual responses to music identifies a variety of musical features and possible affective correlations, but perceptual evaluations of these musical features for the purposes of inclusion in a music generation system are not readily available. A discrete feature, rhythmic density (a function of note duration in each musical bar, regardless of tempo), was selected on the basis that existing literature shows it to be well correlated with affective responses. A prototype system was then designed to produce controlled degrees of variation in rhythmic density via a transformative algorithm. A two-stage perceptual evaluation of a stimulus set created by this prototype was then undertaken. First, listener responses from a pairwise scaling experiment were analysed via Multidimensional Scaling Analysis (MDS). The statistical best-fit solution was rotated such that stimuli with the largest range of variation were placed across the horizontal plane in two dimensions. In this orientation, stimuli with deliberate variation in rhythmic density appeared further from the source material used to generate them than from stimuli generated by random permutation. Second, the same stimulus set was evaluated, in the order suggested by the rotated two-dimensional solution, in a verbal elicitation experiment. A verbal protocol analysis (VPA) found that listener perception of the stimulus set varied along at least two commonly understood emotional descriptors, which might be considered affective correlates of rhythmic density. Thus, these results further corroborate previous studies in which musical parameters were monitored for changes in emotional expression, suggest that similarly parameterised control of perceived emotional content can be achieved in an affective algorithmic composition system, and provide a methodology for evaluating and including further candidate musical features in such a system. Some suggestions regarding the test procedure and analysis techniques are also documented here.
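For concreteness, the sketch below shows one plausible formalisation of rhythmic density as notes per bar independent of tempo, together with a metric MDS embedding of a placeholder pairwise-dissimilarity matrix of the kind a pairwise scaling experiment would produce. The `rhythmic_density` definition and all data here are hypothetical illustrations, not the paper's implementation.

```python
# Illustrative sketch with hypothetical definitions: a simple per-bar
# rhythmic-density measure, plus a 2-D metric MDS embedding of pairwise
# stimulus dissimilarities.
import numpy as np
from sklearn.manifold import MDS

def rhythmic_density(note_durations, bar_length=4.0):
    """Notes per bar, independent of tempo: note count divided by the
    number of bars the durations (in beats) fill."""
    n_bars = sum(note_durations) / bar_length
    return len(note_durations) / n_bars

dense_bar = [0.25] * 16   # sixteen sixteenth-notes in one 4-beat bar
sparse_bar = [2.0, 2.0]   # two half-notes in the same bar
print(rhythmic_density(dense_bar), rhythmic_density(sparse_bar))  # 16.0 vs 2.0

# Placeholder pairwise dissimilarity ratings between 8 stimuli:
rng = np.random.default_rng(1)
d = rng.random((8, 8))
dissim = (d + d.T) / 2          # symmetrise
np.fill_diagonal(dissim, 0.0)   # zero self-dissimilarity

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)  # 2-D solution, rotatable for inspection
print(coords.shape)                 # (8, 2)
```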