Empirical studies of emotions in music have described the role of individual musical features in the recognition of particular emotions. However, no attempt has yet been made to establish whether particular emotions are linked to specific genres. Here this question is investigated by analysing nine separate datasets spanning four categories: classical (three sets), film music (two), popular music (two), and mixed genres (two). A total of 39 musical features were extracted from the audio. Models were then constructed from these features to explain self-reports of valence and arousal, using multiple regression and Random Forest regression. The models were fully validated across the datasets, suggesting low generalizability between genres for valence (16% of variance accounted for) and moderately good generalizability between genres for arousal (43%). In contrast, generalizability within genres was considerably higher (43% and 62%, respectively), which suggests that emotions, especially valence, operate differently depending on the musical genre. The musical features that predicted affect most reliably across genres were identified, yielding a ranked set of features most likely to operate across genres. In conclusion, the implications of the findings and the genre-specificity of emotions in music are discussed.
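As a rough illustration of the cross-dataset validation scheme described above, the sketch below fits a model on each dataset and scores it on every other, averaging the variance accounted for (R²). This is a minimal sketch under stated assumptions, not the study's actual pipeline: the `datasets` dictionary, its names, and the synthetic stand-in data are hypothetical, and scikit-learn's `LinearRegression` and `RandomForestRegressor` stand in for the multiple regression and Random Forest models named in the abstract.

```python
# Sketch of cross-dataset validation: train on one dataset, test on all
# others, and report the mean variance accounted for (R^2). All data here
# are synthetic placeholders; real use would supply the 39 extracted audio
# features (X) and mean valence or arousal ratings (y) per dataset.
from itertools import permutations

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score


def cross_dataset_r2(datasets, model_factory):
    """Fit on each dataset, evaluate on every other; return mean R^2."""
    scores = []
    for train_name, test_name in permutations(datasets, 2):
        X_train, y_train = datasets[train_name]
        X_test, y_test = datasets[test_name]
        model = model_factory()
        model.fit(X_train, y_train)
        scores.append(r2_score(y_test, model.predict(X_test)))
    return float(np.mean(scores))


# Hypothetical stand-ins for the nine datasets: 150 excerpts each,
# 39 musical features, one affect rating per excerpt.
rng = np.random.default_rng(0)
datasets = {
    name: (rng.normal(size=(150, 39)), rng.normal(size=150))
    for name in ["classical1", "classical2", "classical3",
                 "film1", "film2", "pop1", "pop2",
                 "mixed1", "mixed2"]
}

print(cross_dataset_r2(datasets, LinearRegression))
print(cross_dataset_r2(datasets,
                       lambda: RandomForestRegressor(n_estimators=200,
                                                     random_state=0)))
```

Restricting the train/test pairs to datasets within one genre category, rather than iterating over all pairs, would give the within-genre counterpart of this estimate.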