Cultural processes of change bear many resemblances to biological evolution. Identifying the underlying units of evolution, however, has remained elusive in non-biological domains, especially in music, where evolution is often considered a loose metaphor. Here we introduce a general framework to jointly identify underlying units and their associated evolutionary processes using a latent modeling approach. Musical styles and principles of organization in dimensions such as harmony, melody and rhythm can be modeled as partly following an evolutionary process. Furthermore, we propose that such processes can be identified by extracting latent evolutionary signatures from musical corpora, analogous to the analysis of mutational signatures in evolutionary genomics, particularly in cancer. These latent signatures provide a generative code for each song, which allows us to identify broad trends and associations between songs and genres. As a test case, we analyze songs from the McGill Billboard dataset to find popular chord transitions (k-mers), associate them with music genres, and identify latent evolutionary signatures related to these transitions. First, we use a generalized singular value decomposition to identify associations among songs, motifs and genres; we then use a deep generative model based on a Variational Autoencoder (VAE) framework to extract a latent code for each song. We tie these latent representations together across the dataset by incorporating an energy-based prior, which encourages songs close in evolutionary space to share similar codes. Using this framework, we identify broad trends and genre-specific features of the dataset. Further, our evolutionary model outperforms non-evolutionary models in tasks such as period and genre prediction. To our knowledge, ours is the first computational approach to identify and quantify patterns of music evolution de novo.
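To make the first step of the pipeline concrete, the sketch below shows one way chord-transition k-mers could be extracted and counted per song. It is illustrative only: the function names, the toy chord sequences, and the choice of k are assumptions for this example and do not reflect the McGill Billboard annotation format or the implementation used in the paper.

```python
from collections import Counter

def chord_kmers(chords, k=2):
    """Return the ordered chord-transition k-mers (length-k windows) in a song.

    `chords` is a list of chord labels in order of occurrence; consecutive
    duplicates are collapsed so each k-mer represents a true transition.
    """
    # Collapse repeated chords so "C C G" yields the transition ("C", "G").
    collapsed = [c for i, c in enumerate(chords) if i == 0 or c != chords[i - 1]]
    return [tuple(collapsed[i:i + k]) for i in range(len(collapsed) - k + 1)]

def kmer_counts(songs, k=2):
    """Count chord-transition k-mers per song for a {title: chord list} mapping."""
    return {title: Counter(chord_kmers(chords, k)) for title, chords in songs.items()}

# Toy example with hypothetical chord annotations:
songs = {
    "song_a": ["C", "G", "Am", "F", "C", "G"],
    "song_b": ["Dm", "G", "C", "C", "Am", "Dm", "G"],
}
counts = kmer_counts(songs, k=2)
print(counts["song_a"].most_common(3))  # e.g. [(('C', 'G'), 2), (('G', 'Am'), 1), ...]
```

A song-by-k-mer count matrix assembled from such per-song counters is the kind of input on which a generalized singular value decomposition, or the encoder of a VAE, could then operate to relate songs, motifs and genres.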