Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 2020
DOI: 10.1145/3340531.3411938

Query-to-Session Matching: Do NOT Forget History and Future during Response Selection for Multi-Turn Dialogue Systems

Abstract: Given a user query, traditional multi-turn retrieval-based dialogue systems first retrieve a set of candidate responses from the historical dialogue sessions. Then the response selection models select the most appropriate response to the given query. However, previous work only considers the matching between the query and the response but ignores the informative dialogue session in which the response is located. Nevertheless, this session, composed of the response, the response's history and the response's future…
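To make the query-to-session idea concrete, below is a minimal sketch of scoring a candidate response against the query using the whole session (history + response + future) rather than the response alone. The encoder, the helper names (encode, score_query_to_session), and the aggregation weights are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of query-to-session matching; the encoder and helper names
# below are hypothetical stand-ins, not the paper's architecture.
from typing import List
import numpy as np


def encode(text: str) -> np.ndarray:
    """Placeholder sentence encoder (hashed bag of words), standing in for a
    real neural encoder such as BERT."""
    vec = np.zeros(128)
    for token in text.lower().split():
        vec[hash(token) % 128] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec


def score_query_to_session(query: str,
                           history: List[str],
                           response: str,
                           future: List[str]) -> float:
    """Score a candidate by matching the query against the whole session
    (history + response + future), not just the response itself."""
    q = encode(query)
    session_utts = history + [response] + future
    sims = [float(q @ encode(u)) for u in session_utts]
    # Weight the response most, but let its surrounding context contribute
    # as well (the weights here are illustrative, not from the paper).
    response_sim = sims[len(history)]
    context_sim = (sum(sims) - response_sim) / max(len(sims) - 1, 1)
    return 0.7 * response_sim + 0.3 * context_sim


# Rank candidate (response, history, future) triples for a user query.
candidates = [
    ("Try restarting the router first.", ["My internet is down."], ["Did that help?"]),
    ("I like pizza.", ["What should we eat?"], ["Sounds good."]),
]
query = "My wifi stopped working, what should I do?"
ranked = sorted(
    candidates,
    key=lambda c: score_query_to_session(query, c[1], c[0], c[2]),
    reverse=True,
)
print(ranked[0][0])
```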

Cited by 4 publications (2 citation statements)
References 26 publications
“…In recent years, sequence-to-sequence (seq2seq) [61] based neural networks have proved effective in generating fluent sentences. The seq2seq model was originally proposed for machine translation and later adapted to various natural language generation tasks, such as text summarization [10,18,19,22,25,41,48,69,71] and dialogue generation [6,17,20,21,40,50,64,81,85,86]. Rush et al. [53] apply the seq2seq mechanism with an attention model to the text summarization field.…”
Section: Text Generation Methods
Mentioning confidence: 99%
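As a concrete illustration of the encoder-decoder framework the citing statement refers to, the following is a minimal sketch of a seq2seq model with additive attention. The class name, layer sizes, and toy data are illustrative assumptions, not taken from the paper or the works it cites.

```python
# Minimal seq2seq-with-attention sketch (dimensions are illustrative only).
import torch
import torch.nn as nn


class Seq2SeqWithAttention(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRUCell(emb_dim + hid_dim, hid_dim)
        self.attn = nn.Linear(2 * hid_dim, 1)      # additive attention score
        self.out = nn.Linear(hid_dim, vocab_size)  # projection to vocabulary

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        # Encode the source sequence into per-token memory states.
        enc_states, enc_last = self.encoder(self.embed(src))  # (B, S, H), (1, B, H)
        dec_h = enc_last.squeeze(0)                            # (B, H)
        logits = []
        for t in range(tgt.size(1)):
            # Attention: score each encoder state against the decoder state.
            scores = self.attn(torch.cat(
                [enc_states, dec_h.unsqueeze(1).expand_as(enc_states)], dim=-1))
            weights = torch.softmax(scores, dim=1)             # (B, S, 1)
            context = (weights * enc_states).sum(dim=1)        # (B, H)
            # Feed the current target token embedding plus the context vector.
            dec_in = torch.cat([self.embed(tgt[:, t]), context], dim=-1)
            dec_h = self.decoder(dec_in, dec_h)
            logits.append(self.out(dec_h))
        return torch.stack(logits, dim=1)                      # (B, T, V)


# Toy forward pass: batch of 2, source length 5, target length 4, vocab 100.
model = Seq2SeqWithAttention(vocab_size=100)
src = torch.randint(0, 100, (2, 5))
tgt = torch.randint(0, 100, (2, 4))
print(model(src, tgt).shape)  # torch.Size([2, 4, 100])
```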
“…Closely related to conversational recommendation and interactive large language models (OpenAI, 2022), it has extensive applications in the commercial area. With the recent huge success of PLMs (Devlin et al, 2018; Liu et al, 2019), post-training PLMs with diverse self-supervised tasks has become a popular trend and achieves impressive performance (Xu et al, 2020; Gu et al, 2020; Whang et al, 2021; Fu et al, 2023).…”
Section: Related Work
Mentioning confidence: 99%