Three experiments assessed the hypothesis that cognitive benefits associated with exposure to music occur only when the perceived emotional expression of the music matches the participant's affective state. Experiment 1 revealed an affect-matching pattern modulated by gender when assessing high-arousal states of opposite valence (happy/angry) in an adult sample (n = 94), in which mood classification was based on self-report and affective valence in music was differentiated by mode and other expressive cues while tempo was held constant (139 BPM). The affect-matching hypothesis was then tested in two experiments with children using a mood-induction procedure: Experiment 2 tested happy/angry emotions with 3-5-year-old (n = 40) and 6-9-year-old (n = 40) children, respectively, and Experiment 3 compared happy/sad emotions (i.e., states differing in both valence and arousal profile) with 3-5-year-old children (n = 40), using music pieces also differentiated by fast vs. slow tempo. Although young children failed to discriminate systematically between fast-tempo music conveying different emotions, they did display cognitive benefits from exposure to affect-matching music when both valence (e.g., mode) and arousal level (e.g., tempo) differentiated the musical excerpts, with no gender effects.

Keywords: Arousal, Central executive, Child development, Mozart Effect

In music psychology, the beneficial effects of music on psychosocial functioning, and the role played by emotion and mood in producing those effects, are currently important topics of research (MacDonald, Kreutz, & Mitchell, 2012). One area of experimental research that has received particular attention is the study of the 'Mozart effect', a term that refers to findings of temporary improvements in cognitive performance after listening to music (for a review, see Schellenberg, 2012). Doubts about the theoretical underpinnings of the original study that spawned research on the 'Mozart effect' (Rauscher, Shaw, & Ky, 1993) led Schellenberg and collaborators to formulate an alternative theory, the 'arousal and mood hypothesis', which proposed that temporary improvements in cognitive performance are not a direct product of listening to music, but rather a product of the arousal and positive affect induced by listening to music. According to this hypothesis, the 'Mozart effect' is neither Mozart-specific nor music-specific: any piece of music, or any non-musical stimulus, that induces arousal and positive affect can produce temporary improvements in cognitive performance. This proposal is consistent with a wide range of studies finding that positive affect enhances performance on cognitive tasks (e.g., see Isen, 2008). A series of studies conducted by Schellenberg and collaborators supported the arousal and mood hypothesis (Husain, Thompson, & Schellenberg, 2002; Schellenberg, Nakata, Hunter, & Tamoto, 2007; Thompson, Schellenberg, & Husain, 2001; for a review, see Schellenberg, 2012). A key assumption made by Schellenberg and others is that happy-sounding music ...
Research has shown inconsistent results concerning the ability of young children to identify musical emotion. This study explores the influence of the type of musical performance (vocal vs. instrumental) on children's affect identification. Using an independent-groups design, novel child-directed music was presented in three conditions: instrumental, vocal-only, and song (instrumental plus vocals) to 3- to 6-year-olds previously screened for language development (n = 76). A forced-choice task was used in which children chose a face expressing the emotion matching each musical track. All performance conditions comprised 'happy' (major mode/fast tempo) and 'sad' (minor mode/slow tempo) tracks. Nonsense syllables rather than words were used in the vocals in order to avoid the influence of lyrics on children's decisions. The results showed that even the younger children were able to correctly identify the intended emotion in the music, although 'happy' music was more readily recognized and recognition appeared to be facilitated in the instrumental condition. Performance condition interacted with gender.
In music psychology, studies of emotion regulation typically focus on the self-regulation of emotion by music listeners. What is missing from the literature is a model of emotion regulation that accounts for both intrinsic (self-generated) and extrinsic (other-generated) processes, and that could inform an understanding of how music can be used by one person to extrinsically influence emotion regulation in another. To address this gap, the present article develops a theoretical model of musical communication. Musical communication is conceptualized as a goal-directed process in which one musical participant (the therapist) influences the emotion regulation strategies employed by another (the client). The theory proposes that an important goal of musical communication is to encourage clients to regulate their emotional responses in ways that enhance rather than diminish their capacity to sustain attention. This goal is proposed to be pursued by enabling clients to exercise their ability to sustain attention, and by promoting three processes that serve to enhance their attentional capacity: the utilization of emotional arousal to facilitate sensory processing, the facilitation of emotional motor tendencies, and the reduction of behavioural uncertainty. These propositions are analysed and discussed with a focus on emotion regulation in worry-prone clients.