Music can convey basic emotions, such as joy and sadness, as well as more complex ones, such as tenderness or nostalgia. Its effects on emotion regulation and reward have attracted considerable attention in cognitive and affective neuroscience. Understanding the neural mechanisms underlying music-evoked emotions could guide the development of novel, individually tuned music-based neurorehabilitation therapies. This study aims to unravel the relationship between the classification of music excerpts according to perceived affective states and the associated neural correlates, as measured with fMRI. We used valence and arousal to classify both the stimuli and the affective states perceived by the participants. We acquired fMRI data from 20 participants while they listened to 96 musical excerpts classified a priori into the four quadrants of the valence-arousal model. We first characterized the neural correlates of the quadrants, defined by valence (positive, negative) and arousal (high, low), using a GLM analysis. Our results highlight the role of neocortical regions, most notably the music-specific sub-regions of the auditory cortex, as well as the thalamus and regions of the reward network such as the amygdala. Using the multivoxel activity patterns corresponding to the four-quadrant representation of core affect, we built a computational model that decodes the quadrant corresponding to each musical excerpt with significant accuracy, well above a stringent chance level. We further analyzed a set of musical features using regression analysis and explored how they relate to brain activity in valence-, arousal-, reward-, and auditory-related ROIs. The results emphasize the role of expressive features in emotion-related networks. These findings contribute to defining the relationship between music and the neural substrates of music listening and emotion, which is key to developing novel music-based neurorehabilitation strategies.
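To make the decoding step concrete, the sketch below shows one way a four-class quadrant decoder with a permutation-based (stringent) chance level could be set up in Python with scikit-learn. It is a minimal illustration only: the data shapes, the linear SVM classifier, the cross-validation scheme, and the number of permutations are assumptions for the example and are not the authors' actual pipeline.

```python
# Hypothetical sketch of multivoxel quadrant decoding with an empirical chance level.
# All shapes and parameters below are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score, permutation_test_score

rng = np.random.default_rng(0)

# X: one multivoxel activity pattern per musical excerpt (e.g., beta estimates in a mask);
# y: quadrant labels 0-3 from the valence-arousal model (24 excerpts per quadrant).
n_excerpts, n_voxels = 96, 500            # assumed dimensions
X = rng.standard_normal((n_excerpts, n_voxels))   # placeholder data
y = np.repeat(np.arange(4), n_excerpts // 4)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=8, shuffle=True, random_state=0)

# Cross-validated decoding accuracy (nominal chance for four classes is 0.25).
accuracy = cross_val_score(clf, X, y, cv=cv).mean()

# Permutation test: refitting on label-shuffled data yields an empirical chance
# distribution, a more stringent reference than the nominal 0.25 level.
score, perm_scores, p_value = permutation_test_score(
    clf, X, y, cv=cv, n_permutations=100, random_state=0
)
print(f"accuracy={accuracy:.3f}, permutation p={p_value:.3f}")
```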