Pitch is a primary perceptual dimension of sounds and is crucial in music and speech perception. When listening to melodies, most humans encode the relations between pitches into memory using an ability called relative pitch (RP). A small subpopulation, almost exclusively musicians, preferentially encodes pitches using absolute pitch (AP): the ability to identify the pitch of a sound without an external reference. In this study, we recruited a large sample of musicians with AP (AP musicians) and without AP (RP musicians). The participants performed a pitch-processing task with a Listening and a Labeling condition during functional magnetic resonance imaging. General linear model analysis revealed that while labeling tones, AP musicians showed lower blood oxygenation level dependent (BOLD) signal in the inferior frontal gyrus and the presupplementary motor area, brain regions associated with working memory, language functions, and auditory imagery. At the same time, AP musicians labeled tones more accurately, suggesting that AP might be an example of neural efficiency. In addition, using multivariate pattern analysis, we found that BOLD signal patterns in the inferior frontal gyrus and the presupplementary motor area differentiated between the groups. These clusters were similar, but not identical, to the general linear model-based clusters. Therefore, information about AP and RP might be present on different spatial scales. While listening to tones, AP musicians showed increased BOLD signal in the right planum temporale, which may reflect the matching of pitch information with internal templates and corroborates the importance of the planum temporale in AP processing.
Humans with absolute pitch (AP) are able to effortlessly name the pitch class of a sound without an external reference. The association of labels with pitches cannot be entirely suppressed even if it interferes with task demands. This suggests a high level of automaticity of pitch labeling in AP. The automatic nature of AP was further investigated in a study by Rogenmoser et al. (2015). Using a passive auditory oddball paradigm in combination with electroencephalography, they observed electrophysiological differences between musicians with and without AP in response to piano tones. Specifically, the AP musicians showed a smaller P3a, an event-related potential (ERP) component presumably reflecting early attentional processes. In contrast, they did not find group differences in the mismatch negativity (MMN), an ERP component associated with auditory memory processes. They concluded that early cognitive processes are facilitated in AP during passive listening and are more important for AP than the preceding sensory processes. In our direct replication study on a larger sample of musicians with (n = 54, 27 females, 27 males) and without (n = 50, 24 females, 26 males) AP, we successfully replicated the non-significant effects of AP on the MMN. However, we could not replicate the significant effects for the P3a. Additional Bayes factor analyses revealed moderate to strong evidence (Bayes factor > 3) for the null hypothesis for both MMN and P3a. Therefore, the results of this replication study do not support the postulated importance of cognitive facilitation in AP during passive tone listening.
Pitch is a fundamental attribute of sounds and yet is not perceived equally by all humans. Absolute pitch (AP) musicians perceive, recognize, and name pitches in absolute terms, whereas relative pitch (RP) musicians, representing the large majority of musicians, perceive pitches in relation to other pitches. In this study, we used electroencephalography (EEG) to investigate the neural representations underlying tone listening and tone labeling in a large sample of musicians (n = 105). Participants performed a pitch processing task with a listening and a labeling condition during EEG acquisition. Using a brain-decoding framework, we tested a prediction derived from both theoretical and empirical accounts of AP, namely that the representational similarity of listening and labeling is higher in AP musicians than in RP musicians. Consistent with the prediction, time-resolved single-trial EEG decoding revealed a higher representational similarity in AP musicians during late stages of pitch perception. Time-frequency-resolved EEG decoding further showed that the higher representational similarity was present in oscillations in the theta and beta frequency bands. Supplemental univariate analyses were less sensitive in detecting subtle group differences in the frequency domain. Taken together, the results suggest differences between AP and RP musicians in late pitch processing stages associated with cognition, rather than in early processing stages associated with perception.