Development of EEG-based brain-computer interface (BCI) methods has largely focused on creating a communication channel for subjects with intact cognition but profound loss of motor control from stroke or neurodegenerative disease, allowing such subjects to communicate by spelling out words on a personal computer. However, other important human communication channels may also be limited or unavailable to handicapped subjects: direct non-linguistic emotional communication by gesture, vocal prosody, facial expression, and the like. We report and examine a first demonstration of a musical 'emotion BCI' in which, as one element of a live musical performance, an able-bodied subject successfully triggered the electronic delivery of an ordered sequence of five two-tone bass-frequency drone sounds by imaginatively re-experiencing the human feeling he had spontaneously associated with each drone during training sessions. The EEG data included activity of both brain and non-brain sources (scalp muscles, eye movements). Common Spatial Pattern classification gave 84% correct pseudo-online performance and 5-of-5 correct classification in live performance. Re-analysis of the training-session data using only the brain EEG sources identified by multiple-mixture AMICA ICA decomposition achieved five-class classification accuracy of 59-70%, confirming that different voluntary emotion-imagination experiences may be associated with distinguishable brain-source EEG dynamics.
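For readers unfamiliar with the classification step, the following is a minimal sketch (not the authors' code) of how Common Spatial Pattern features can feed a multi-class classifier using MNE-Python and scikit-learn; the synthetic data shapes, channel count, component number, and LDA back end are assumptions for illustration only.

```python
# Minimal sketch: five-class CSP + LDA classification of epoched EEG.
# Data shapes, channel count, and classifier choice are illustrative assumptions.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 200, 32, 512        # hypothetical recording
X = rng.standard_normal((n_epochs, n_channels, n_times))
y = rng.integers(0, 5, size=n_epochs)               # five imagined-emotion classes

clf = Pipeline([
    ("csp", CSP(n_components=6, log=True)),         # spatial filters -> log-variance features
    ("lda", LinearDiscriminantAnalysis()),
])
scores = cross_val_score(clf, X, y, cv=5)           # chance level here is 20%
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With real epoched EEG in place of the random array, the same pipeline yields the kind of five-class accuracy figures quoted above.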
We engineered an interactive music system that influences a user's breathing rate to induce a relaxation response. This system generates ambient music containing periodic shifts in loudness that are determined by the user's own breathing patterns. We evaluated the efficacy of this music intervention for participants who were engaged in an attention-demanding task, and thus explicitly not focusing on their breathing or on listening to the music. We measured breathing patterns in addition to multiple peripheral and cortical indicators of physiological arousal while users experienced three different interaction designs: (1) a "Fixed Tempo" design with an amplitude modulation rate of six beats per minute; (2) a "Personalized Tempo" design with the modulation rate fixed at 75% of each individual's baseline breathing rate; and (3) a "Personalized Envelope" design in which the amplitude modulation matches each individual's breathing pattern in real time. Our results revealed that each interactive music design slowed breathing rates, with the "Personalized Tempo" design having the largest effect, one significantly greater than that of the non-personalized design. The physiological arousal indicators (electrodermal activity, heart rate, and slow cortical potentials measured in EEG) showed concomitant reductions, suggesting that slowing users' breathing rates shifted them towards a calmer state. These results suggest that interactive music incorporating biometric data may have greater effects on physiology than traditional recorded music.
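As an illustration of the interaction designs, the sketch below generates a slow loudness envelope at 75% of a listener's baseline breathing rate, roughly corresponding to the "Personalized Tempo" condition; the sinusoidal shape, modulation depth, and parameter values are assumptions for illustration, not the system's actual synthesis code.

```python
# Minimal sketch, assuming the "Personalized Tempo" design amounts to a
# sinusoidal loudness envelope at 75% of the baseline breathing rate.
# Baseline value, modulation depth, and sample rate are illustrative.
import numpy as np

def loudness_envelope(baseline_bpm, duration_s, sr=44_100, depth=0.5):
    """Slow sinusoidal gain curve for amplitude-modulating ambient music."""
    target_hz = 0.75 * baseline_bpm / 60.0           # personalized tempo in Hz
    t = np.arange(int(duration_s * sr)) / sr
    # Gain oscillates between (1 - depth) and 1.0 at the target rate.
    return 1.0 - depth * 0.5 * (1.0 + np.cos(2 * np.pi * target_hz * t))

env = loudness_envelope(baseline_bpm=15, duration_s=60)   # 15 breaths/min baseline
# modulated_audio = env * ambient_audio                   # element-wise gain on the music
```

The "Personalized Envelope" condition would instead derive the gain curve directly from the live respiration signal rather than from a fixed rate.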
We have developed a method for studying musical engagement using simple expressive rhythmic "conducting" gestures matching the musical pulse. Expert and nonexpert participants attempted to communicate the feeling of heard musical excerpts using simple rhythmic U-shaped hand/arm "conducting" gestures that animated, in real time, the movement of a spot of light on a video display, while body motion capture and high-density electroencephalography (EEG) recorded their hand/arm movements and brain activity. The animations were intended to focus and simplify the conductors' hand movements. A Web-based pool of viewers then rated the recorded moving-spot animations using a musical affect rating scale. Affective ratings of the musical excerpts by conductor and viewer groups were correlated. Statistically significant differences were found in both the motion capture and the EEG data between the fully "engaged" condition described earlier and a "less musically engaged" condition in which conductors concurrently performed a mental dual distractor task. A theta/alpha-band EEG power increase in or near the right parietal-temporal-occipital junction before onsets of left-to-right swings appeared only in the fully engaged condition. Our results support: (1) Musical feeling and engagement can be communicated by rhythmic musical gestures alone, even after their reduction to a simple point-light display. (2) EEG brain activity in or near the right parietal-temporal-occipital junction appears to support affective gestural communication. (3) Expressive rhythmic gesturing does not seem to interfere with musical engagement, a possible advantage over other methods used to measure emotional engagement during music listening.
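To make the EEG measure concrete, here is a minimal sketch (assuming a Welch power-spectral-density estimate, not the study's actual analysis) of computing theta/alpha (4-12 Hz) power in one-second windows preceding gesture onsets; the sampling rate, window length, and synthetic single-channel data are illustrative assumptions.

```python
# Minimal sketch: theta/alpha (4-12 Hz) band power in short EEG windows
# preceding gesture onsets, estimated with a Welch PSD.
import numpy as np
from scipy.signal import welch

def pre_onset_band_power(eeg, onsets, sr=256, win_s=1.0, band=(4.0, 12.0)):
    """Mean 4-12 Hz power in the window ending at each onset sample."""
    n_win = int(win_s * sr)
    powers = []
    for onset in onsets:
        if onset < n_win:
            continue                                 # skip onsets too early to window
        seg = eeg[onset - n_win:onset]               # single-channel segment before onset
        freqs, psd = welch(seg, fs=sr, nperseg=min(n_win, 256))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        powers.append(psd[mask].mean())
    return np.array(powers)

# Synthetic example for one channel (e.g., near the right parietal-temporal-occipital junction):
eeg = np.random.default_rng(1).standard_normal(256 * 60)   # one minute at 256 Hz
onsets = np.arange(512, eeg.size, 512)                      # hypothetical swing-onset samples
print(pre_onset_band_power(eeg, onsets).mean())
```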