2021
DOI: 10.1609/aaai.v35i14.17502

Reasoning in Dialog: Improving Response Generation by Context Reading Comprehension

Abstract: In multi-turn dialog, utterances do not always take the full form of sentences (Carbonell 1983), which naturally makes understanding the dialog context more difficult. However, fully grasping the dialog context is essential to generating a reasonable response. Hence, in this paper, we propose to improve response generation performance by examining the model's ability to answer a reading comprehension question, where the question is focused on the omitted information in the dialog. Enlightened by the multi…
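The abstract outlines a joint objective: the generator is also trained to answer a reading-comprehension question about information omitted from the dialog. As a minimal sketch of that idea only (the abstract is truncated, so this is not the authors' architecture; every name here, including DialogWithMRC, qa_head, and the weight alpha, is a hypothetical placeholder):

```python
# Toy sketch: combine a response-generation loss with an auxiliary
# reading-comprehension (QA) loss over the same dialog encoder.
# Assumed setup, not the paper's actual model.
import torch
import torch.nn as nn

class DialogWithMRC(nn.Module):
    """Encode the dialog context once; predict response tokens and,
    as an auxiliary task, an answer about the omitted information."""
    def __init__(self, vocab_size=1000, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.gen_head = nn.Linear(hidden, vocab_size)  # response-token logits
        self.qa_head = nn.Linear(hidden, vocab_size)   # answer-token logits

    def forward(self, context_ids):
        h, _ = self.encoder(self.embed(context_ids))
        return self.gen_head(h), self.qa_head(h[:, -1])

model = DialogWithMRC()
ce = nn.CrossEntropyLoss()
context = torch.randint(0, 1000, (2, 16))   # dummy dialog context
response = torch.randint(0, 1000, (2, 16))  # dummy gold response tokens
answer = torch.randint(0, 1000, (2,))       # dummy gold answer token

gen_logits, qa_logits = model(context)
alpha = 0.5  # assumed task weighting; the paper's scheme may differ
loss = ce(gen_logits.reshape(-1, 1000), response.reshape(-1)) \
       + alpha * ce(qa_logits, answer)
loss.backward()
```

The point of the sketch is only the shared encoder and the weighted sum of the two losses; how the comprehension questions are constructed is described in the paper itself.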


Cited by 6 publications (2 citation statements)
References 48 publications
“…Headline Generation. In recent years, text generation has made impressive progress (Li et al. 2019; Chan et al. 2019; Liu et al. 2020; Xie et al. 2020; Chan et al. 2020; Chen et al. 2021), and headline generation has become a research hotspot in Natural Language Processing. Most existing headline generation works solely focus on summarizing the document.…”
Section: Related Work (mentioning)
confidence: 99%
“…Open-domain dialogue generation is becoming a research hotspot in the community of natural language processing due to its potential applications (Li et al. 2019; Chen et al. 2021b). Generally, in the paradigm of deep neural networks, a large quantity of training data is required to facilitate the convergence of these models.…”
Section: Introduction (mentioning)
confidence: 99%