2020
DOI: 10.1109/access.2020.2978551
Enhancements to the Sequence-to-Sequence-Based Natural Answer Generation Models

Abstract: There is great interest among academic researchers in continuously improving the sequence-to-sequence (Seq2Seq) model for natural answer generation (NAG) in chatbots. The Seq2Seq model shows a weakness whereby it tends to generate answers that are generic, meaningless, and inconsistent with the questions. However, a comprehensive literature review of the factors contributing to this weakness and of potential solutions is still missing. Therefore, this review article fills the gap by reviewing Seq2Seq bas…

Cited by 20 publications (13 citation statements) · References 62 publications
“…The Seq2seq model has proven superior to other neural network models across a wide range of natural language generation tasks [40], including machine translation, which is related to our task. In this section, we first define the TCM prescription generation task, then introduce how the translator model is applied to prescription prediction, and finally show how the model is enhanced to generate more diverse herbs in the specific setting of this task by introducing attention and coverage mechanisms.…”
Section: Methods (mentioning)
Confidence: 98%
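The attention-plus-coverage enhancement mentioned in this citation statement can be summarised compactly. The following is a minimal PyTorch-style sketch, not the cited authors' implementation: class and parameter names are illustrative, and it assumes additive (Bahdanau) attention extended with a coverage vector in the style of See et al.'s coverage mechanism.

```python
import torch
import torch.nn as nn

class CoverageAttention(nn.Module):
    """Additive attention extended with a coverage vector (illustrative sketch).
    The coverage vector accumulates past attention weights so the decoder is
    discouraged from attending to the same source positions repeatedly,
    which is what yields more diverse outputs (e.g., herbs)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.w_enc = nn.Linear(hidden_size, hidden_size, bias=False)
        self.w_dec = nn.Linear(hidden_size, hidden_size, bias=False)
        self.w_cov = nn.Linear(1, hidden_size, bias=False)  # coverage feature
        self.v = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, enc_outputs, dec_state, coverage):
        # enc_outputs: (batch, src_len, hidden); dec_state: (batch, hidden)
        # coverage:    (batch, src_len) -- sum of all previous attention maps
        scores = self.v(torch.tanh(
            self.w_enc(enc_outputs)
            + self.w_dec(dec_state).unsqueeze(1)
            + self.w_cov(coverage.unsqueeze(-1))
        )).squeeze(-1)                                    # (batch, src_len)
        attn = torch.softmax(scores, dim=-1)
        context = torch.bmm(attn.unsqueeze(1), enc_outputs).squeeze(1)
        # Coverage loss penalises re-attending: sum of min(attn_t, coverage_t),
        # computed against the coverage accumulated *before* this step.
        cov_loss = torch.sum(torch.min(attn, coverage), dim=-1)
        coverage = coverage + attn                        # accumulate for next step
        return context, attn, coverage, cov_loss
```

Adding `cov_loss` to the training objective penalises attention mass placed on positions that are already well covered, steering each decoding step toward source content not yet used.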
“…In this context, the sequence is a list of symbols corresponding to the words in a sentence. It has achieved great success in widely used areas such as machine translation, dialogue systems, question answering, and text summarization [32]-[34]. It typically consists of a recurrent deep learning network and an attention component.…”
Section: Seq2seq (unclassified)
“…Sequence-to-Sequence learning is a machine learning method based on neural networks that is mostly used in language processing models [17][18][19][20][21][22][23]. It can be implemented with recurrent neural networks (RNNs) in an encoder-decoder arrangement that maps an input sequence to an output sequence, together with a tag and an attention value.…”
Section: Introduction (mentioning)
Confidence: 99%
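As a concrete illustration of the encoder-decoder mapping described in this last statement, here is a minimal sketch assuming a PyTorch GRU implementation; vocabulary sizes and dimensions are arbitrary, and the attention component is omitted for brevity (see the coverage-attention sketch above).

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal GRU encoder-decoder: the encoder compresses the input
    sequence into a final hidden state, which initialises the decoder
    that emits the output sequence token by token."""

    def __init__(self, src_vocab: int, tgt_vocab: int, emb: int = 128, hidden: int = 256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # src_ids: (batch, src_len); tgt_ids: (batch, tgt_len)
        _, h = self.encoder(self.src_emb(src_ids))    # h: (1, batch, hidden)
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), h)
        return self.out(dec_out)                      # (batch, tgt_len, tgt_vocab)

# Toy usage: map a batch of two length-5 inputs to length-4 output logits.
model = Seq2Seq(src_vocab=1000, tgt_vocab=1200)
logits = model(torch.randint(0, 1000, (2, 5)), torch.randint(0, 1200, (2, 4)))
print(logits.shape)  # torch.Size([2, 4, 1200])
```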