Much research has sought to recognize and retrieve music on the basis of emotion labels. These labels are usually obtained either from subjective tests or from social tags. Researchers typically use social tags either by grouping them directly into emotion categories or clusters, or by simply mapping them to quadrants of a dimensional emotion space. Little work has applied semantic analysis to social tags in order to project them into a dimensional emotion space, especially using recent neural word embedding techniques trained on large-scale datasets. In this paper, we propose an effective solution for analysing music tags and representing them on a two-dimensional emotion plane, without restricting the corpus to emotion terms only. In our solution, we apply neural word embedding methods for tag representation, including Skip-gram, Continuous Bag-Of-Words (CBOW), and Global Vectors (GloVe). In our experiments, we compare these methods with the traditional Latent Semantic Analysis (LSA) model, using Procrustes Analysis as the evaluation metric. The results show that the neural tag embedding methods outperform LSA and represent tags in close agreement with the classic circumplex model of emotion.
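
To make the evaluation setup concrete, the following is a minimal sketch of the pipeline summarized above: embed tags, project them to a 2-D plane, and score the configuration against a circumplex reference with Procrustes Analysis. The model file name, the tag list, the circumplex angles, and the use of PCA for the 2-D projection are all illustrative assumptions, not the paper's exact method.

```python
# Minimal sketch: compare 2-D tag-embedding coordinates against a circumplex
# reference using Procrustes Analysis. Assumes a pre-trained Skip-gram model
# in word2vec binary format; the path is a placeholder.
import numpy as np
from gensim.models import KeyedVectors
from sklearn.decomposition import PCA
from scipy.spatial import procrustes

# Hypothetical emotion tags with approximate angles (degrees) on Russell's
# circumplex plane (x = valence, y = arousal); values are illustrative only.
CIRCUMPLEX = {
    "happy": 20, "excited": 70, "tense": 110, "angry": 160,
    "sad": 200, "depressed": 250, "calm": 290, "relaxed": 340,
}

# Load embeddings trained on a large tag corpus (file name is an assumption).
wv = KeyedVectors.load_word2vec_format("tag_skipgram.bin", binary=True)

tags = [t for t in CIRCUMPLEX if t in wv]
X = np.array([wv[t] for t in tags])       # high-dimensional tag vectors

# Project the tag embeddings down to a 2-D emotion plane (here via PCA).
X2 = PCA(n_components=2).fit_transform(X)

# Reference configuration: unit-circle positions from the circumplex model.
theta = np.radians([CIRCUMPLEX[t] for t in tags])
Y2 = np.column_stack([np.cos(theta), np.sin(theta)])

# Procrustes Analysis optimally aligns the two configurations (translation,
# scaling, rotation) and returns a disparity score; lower means the embedded
# tags approximate the circumplex layout more closely.
_, _, disparity = procrustes(Y2, X2)
print(f"Procrustes disparity vs. circumplex reference: {disparity:.4f}")
```

The same disparity score can be computed for each embedding method (Skip-gram, CBOW, GloVe, LSA) to compare how closely each one reproduces the circumplex arrangement.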