Abstract. Objective: We aim to develop and evaluate an affective brain-computer music interface (aBCMI) for modulating the affective states of its users. Approach: An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music that is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. Main results: The final online aBCMI is able to detect its users' current affective states with classification accuracies of up to 65% (three classes, p < 0.01) and to modulate its users' affective states significantly above chance level (p < 0.05). Significance: Our system represents one of the first demonstrations of an online affective brain-computer music interface that is able to accurately detect and respond to its users' affective states. Possible applications include music therapy and entertainment.
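The abstract does not specify the aBCMI's feature set or classifier, so the online affective-state detection stage is illustrated here only as a minimal sketch, assuming EEG band-power features and a linear discriminant classifier trained on labelled calibration trials; all names and parameters below are illustrative.

```python
# Illustrative sketch only: detect a user's affective state (3 classes) from EEG.
# The aBCMI abstract does not specify its features or classifier; band-power
# features and linear discriminant analysis (LDA) are assumptions made here.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands (Hz)

def band_power_features(epochs, fs):
    """epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
    feats = []
    for trial in epochs:
        freqs, psd = welch(trial, fs=fs, nperseg=int(2 * fs), axis=-1)
        feats.append(np.concatenate([
            np.log(psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1))
            for lo, hi in BANDS.values()
        ]))
    return np.array(feats)

def train_and_predict(calib_epochs, calib_labels, new_epochs, fs=256):
    """Fit on labelled calibration trials, then classify new (online) epochs."""
    clf = LinearDiscriminantAnalysis()
    clf.fit(band_power_features(calib_epochs, fs), calib_labels)
    return clf.predict(band_power_features(new_epochs, fs))
```

In an online system of this kind, the predicted class would be passed to the music generation and case-based reasoning components as the current state, which then select an affective target for the generated music.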
It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions a piece of music will induce in a given individual. We attempt to predict the music-induced emotional response in a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests that over 20% of the variance of the participants' music-induced emotions can be predicted by their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, combining measures of brain activity with acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracies than either feature type alone (p < 0.01).
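A minimal sketch of the comparison described above, assuming ridge regression and cross-validated prediction (the abstract does not name the regression method or the exact features), evaluating EEG features, acoustic features, and their combination against continuous emotion ratings:

```python
# Illustrative sketch only: compare emotion prediction from EEG features,
# acoustic features, and their combination. Ridge regression and 5-fold
# cross-validation are assumptions; the abstract does not name the model.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

def compare_feature_sets(eeg_feats, acoustic_feats, ratings):
    """eeg_feats, acoustic_feats: (n_trials, n_features); ratings: (n_trials,)."""
    results = {}
    for name, X in {
        "eeg_only": eeg_feats,
        "acoustic_only": acoustic_feats,
        "combined": np.hstack([eeg_feats, acoustic_feats]),
    }.items():
        predicted = cross_val_predict(Ridge(alpha=1.0), X, ratings, cv=5)
        results[name] = pearsonr(ratings, predicted)  # (r, p) per feature set
    return results
```

Under the result reported in the abstract, the "combined" feature set would be expected to yield a significantly higher correlation than either feature set alone.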
Previous studies of change blindness have suggested a distinction between detection and localisation of changes in a visual scene. Using a simple paradigm with an array of coloured squares, the present study aimed to further investigate differences in event-related potentials (ERPs) between trials in which participants could detect the presence of a colour change but not identify the location of the change (sense trials), versus those in which participants could both detect and localise the change (localise trials). Individual differences in performance were controlled for by adjusting the difficulty of the task in real time. Behaviourally, reaction times for sense, blind, and false alarm trials were distinguishable when comparing across levels of participant certainty. In the EEG data, we found no significant differences in the visual awareness negativity ERP, contrary to previous findings. In the N2pc range, both awareness conditions (localise and sense) were significantly different from trials with no change detection (blind trials), suggesting that this ERP is not dependent on explicit awareness. Within the late positivity range, all conditions differed significantly from one another. These results suggest that changes can be ‘sensed’ without knowledge of the location of the changing object, and that participant certainty scores can provide valuable information about the perception of changes in change blindness.
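The real-time difficulty adjustment can be illustrated with a simple up/down staircase sketch; the step rule, step size, and bounds below are assumptions for illustration, not the study's actual procedure.

```python
# Illustrative sketch only: a simple up/down staircase for adjusting task
# difficulty in real time. Step size and bounds are assumptions, not the
# study's actual parameters.
def update_change_magnitude(magnitude, correct, step=0.05,
                            minimum=0.05, maximum=1.0):
    """Return the colour-change magnitude for the next trial.

    A correct response makes the next trial harder (smaller change);
    an incorrect response makes it easier (larger change).
    """
    magnitude += -step if correct else step
    return min(max(magnitude, minimum), maximum)

# Example: starting at 0.5, a hit then a miss returns the level to 0.5.
level = update_change_magnitude(0.5, correct=True)     # 0.45
level = update_change_magnitude(level, correct=False)  # 0.50
```

A rule of this kind keeps each participant near their own detection threshold, which is one common way to control for individual differences in performance.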
Beat perception is fundamental to how we experience music, and yet the mechanism behind this spontaneous building of the internal beat representation is largely unknown. Existing findings support links between the tempo (speed) of the beat and enhancement of electroencephalogram (EEG) activity at tempo-related frequencies, but there are no studies looking at how tempo may affect the underlying long-range interactions between EEG activity at different electrodes. The present study investigates these long-range interactions using EEG activity recorded from 21 volunteers listening to music stimuli played at 4 different tempi (50, 100, 150 and 200 beats per minute). The music stimuli consisted of piano excerpts designed to convey the emotion of “peacefulness”. Noise stimuli with an identical acoustic content to the music excerpts were also presented for comparison purposes. The brain activity interactions were characterized with the imaginary part of coherence (iCOH) in the frequency range 1.5–18 Hz (δ, θ, α and lower β) between all pairs of EEG electrodes for the four tempi and the music/noise conditions, as well as a baseline resting state (RS) condition obtained at the start of the experimental task. Our findings can be summarized as follows: (a) there was an ongoing long-range interaction in the RS engaging fronto-posterior areas; (b) this interaction was maintained in both music and noise, but its strength and directionality were modulated as a result of acoustic stimulation; (c) the topological patterns of iCOH were similar for music, noise and RS, however statistically significant differences in strength and direction of iCOH were identified; and (d) tempo had an effect on the direction and strength of motor-auditory interactions. Our findings are in line with existing literature and illustrate a part of the mechanism by which musical stimuli with different tempi can entrain changes in cortical activity.
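The iCOH measure used above follows the standard definition iCOH(f) = Im(S_xy(f) / sqrt(S_xx(f) S_yy(f))), where S_xy is the cross-spectral density and S_xx, S_yy are the auto-spectra. A minimal sketch for a single electrode pair, with assumed window settings, might look like this:

```python
# Illustrative sketch only: imaginary part of coherency (iCOH) between two EEG
# channels in the 1.5-18 Hz range. Window length and averaging are assumptions,
# not the paper's exact settings.
import numpy as np
from scipy.signal import csd, welch

def imaginary_coherence(x, y, fs, fmin=1.5, fmax=18.0):
    """x, y: 1-D signals from two electrodes; returns (freqs, iCOH) in the band."""
    nperseg = int(2 * fs)                       # assumed 2-second windows
    f, sxy = csd(x, y, fs=fs, nperseg=nperseg)  # cross-spectral density (complex)
    _, sxx = welch(x, fs=fs, nperseg=nperseg)   # auto-spectra
    _, syy = welch(y, fs=fs, nperseg=nperseg)
    icoh = np.imag(sxy / np.sqrt(sxx * syy))    # Im of normalised coherency
    band = (f >= fmin) & (f <= fmax)
    return f[band], icoh[band]
```

Because iCOH is signed, its sign indicates which of the two signals leads, which is how both the strength and the direction of long-range interactions can be assessed from the same measure.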