Pitch is a fundamental attribute of sounds, and yet it is not perceived equally by all humans. Absolute pitch (AP) musicians perceive, recognize, and name pitches in absolute terms, whereas relative pitch (RP) musicians, who constitute the large majority of musicians, perceive pitches in relation to other pitches. In this study, we used electroencephalography (EEG) to investigate the neural representations underlying tone listening and tone labeling in a large sample of musicians (n = 105). Participants performed a pitch processing task with a listening and a labeling condition during EEG acquisition. Using a brain-decoding framework, we tested a prediction derived from both theoretical and empirical accounts of AP, namely that the representational similarity of listening and labeling is higher in AP musicians than in RP musicians. Consistent with this prediction, time-resolved single-trial EEG decoding revealed a higher representational similarity in AP musicians during late stages of pitch perception. Time-frequency-resolved EEG decoding further showed that the higher representational similarity was present in oscillations in the theta and beta frequency bands. Supplemental univariate analyses were less sensitive in detecting subtle group differences in the frequency domain. Taken together, the results suggest differences between AP and RP musicians in late pitch processing stages associated with cognition, rather than in early processing stages associated with perception.

Keywords
auditory perception, multivariate pattern analysis, representational similarity analysis, decoding, EEG
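
To make the core analysis idea concrete, the sketch below illustrates time-resolved representational similarity between a listening and a labeling condition: at each time point, a representational dissimilarity matrix (RDM) over tones is computed per condition from channel-space EEG patterns, and the two RDMs are rank-correlated. This is a minimal illustration of the general technique named in the abstract, not the authors' pipeline; the data shapes, variable names, and use of simulated data are assumptions.

```python
# Minimal sketch of time-resolved representational similarity analysis (RSA)
# between a listening and a labeling condition. Simulated data; shapes and
# names are illustrative assumptions, not the study's actual pipeline.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_tones, n_channels, n_times = 12, 64, 200  # assumed dimensions

# Tone-averaged EEG patterns per condition: (tones x channels x time points)
listen = rng.standard_normal((n_tones, n_channels, n_times))
label = rng.standard_normal((n_tones, n_channels, n_times))

similarity = np.empty(n_times)
for t in range(n_times):
    # RDM per condition at time t: pairwise correlation distance
    # between tone-specific channel patterns (condensed vector form).
    rdm_listen = pdist(listen[:, :, t], metric="correlation")
    rdm_label = pdist(label[:, :, t], metric="correlation")
    # Representational similarity = rank correlation of the two RDMs.
    similarity[t] = spearmanr(rdm_listen, rdm_label)[0]

# A group comparison (AP vs. RP) would then contrast these per-participant
# similarity time courses, e.g., with permutation-based statistics.
```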