Recently, neural-network-based tag recommendation has attracted much attention. Most of these methods regard tag recommendation as multi-label classification: they treat different tags as individual categories with fixed labels, without considering the rich relations among tags. Moreover, a fixed number of categories cannot accommodate the dynamic, evolving tags that accompany ongoing topics. In this paper, we transform tag recommendation into a word-based text generation problem and introduce a sequence-to-sequence model. To efficiently model both the semantic dependencies among tags in a tag sequence and the strong sequential relations among tag words, we propose a sequence-to-sequence model named LSTM-Attention. The model inherits the advantages of a recurrent encoder for sequential modeling and an attention-based decoder for learning relations globally. In addition, as a text generation method, the proposed model is able to generate unseen tags, which makes it more applicable and flexible in real-world scenarios. Extensive experimental results on two datasets, Zhihu and Weibo, show that the proposed model significantly outperforms state-of-the-art text-classification-based methods and demonstrate its advantage in handling unseen tags.
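To make the described encoder-decoder architecture concrete, the following is a minimal sketch in PyTorch of an LSTM encoder paired with an attention-based decoder that generates tag words one at a time. All names and hyperparameters (Seq2SeqTagger, vocab_size, embed_dim, hidden_dim) and the dot-product attention variant are illustrative assumptions, not the paper's exact LSTM-Attention implementation.

```python
# Sketch of a seq2seq tag generator: LSTM encoder + attentive decoder.
# Dimensions, module names, and the dot-product attention are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2SeqTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Recurrent encoder: captures the sequential structure of the post.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Recurrent decoder cell: emits the tag-word sequence step by step.
        self.decoder = nn.LSTMCell(embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, src, tgt):
        enc_out, (h, c) = self.encoder(self.embed(src))   # enc_out: (B, S, H)
        h, c = h.squeeze(0), c.squeeze(0)                 # (B, H) each
        logits = []
        for t in range(tgt.size(1)):  # teacher forcing over tag words
            h, c = self.decoder(self.embed(tgt[:, t]), (h, c))
            # Dot-product attention over all encoder states, so each decoding
            # step can look at the whole input (global relations).
            scores = torch.bmm(enc_out, h.unsqueeze(2)).squeeze(2)   # (B, S)
            context = torch.bmm(F.softmax(scores, dim=1).unsqueeze(1),
                                enc_out).squeeze(1)                  # (B, H)
            logits.append(self.out(torch.cat([h, context], dim=1)))
        return torch.stack(logits, dim=1)  # (B, T, vocab): next-word scores

# Toy usage: 2 posts of length 5, decoding 3 tag-word steps.
model = Seq2SeqTagger(vocab_size=1000)
src = torch.randint(0, 1000, (2, 5))
tgt = torch.randint(0, 1000, (2, 3))
print(model(src, tgt).shape)  # torch.Size([2, 3, 1000])
```

Because the decoder emits tag words from an open vocabulary rather than selecting among fixed category labels, such a model can in principle compose tags never seen during training, which is the flexibility the abstract highlights.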