“…In recent years, sequence-to-sequence (seq2seq) [61] neural networks have proved effective at generating fluent sentences. The seq2seq model was originally proposed for machine translation and was later adapted to various natural language generation tasks, such as text summarization [10,18,19,22,25,41,48,69,71] and dialogue generation [6,17,20,21,40,50,64,81,85,86]. Rush et al. [53] apply the seq2seq framework with an attention mechanism to text summarization.…”
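To make the attention mechanism mentioned above concrete, here is a minimal dependency-free sketch of one attention step in a seq2seq decoder: the decoder state is scored against each encoder state by a dot product, the scores are normalized with a softmax, and the resulting weights form a context vector. This is a generic Luong-style dot-product attention sketch, not the exact formulation of Rush et al. [53] (who use a different scoring function); all function and variable names are illustrative.

```python
import math

def dot_product_attention(decoder_state, encoder_states):
    """One attention step: score each encoder state against the decoder
    state, softmax the scores, and return (weights, context vector).

    decoder_state: list[float] of dimension d
    encoder_states: list of list[float], each of dimension d
    """
    # Alignment scores via dot product (illustrative scoring choice).
    scores = [sum(d * h_i for d, h_i in zip(decoder_state, h))
              for h in encoder_states]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Context vector: attention-weighted sum of encoder states.
    dim = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context
```

In a full seq2seq model, the context vector would be concatenated with the decoder state before predicting the next output token.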