2021
DOI: 10.1609/aaai.v35i15.17615

Exploring Auxiliary Reasoning Tasks for Task-oriented Dialog Systems with Meta Cooperative Learning

Abstract: In this paper, we propose a Meta Cooperative Learning (MCL) framework for task-oriented dialog systems (TDSs). Our model consists of an auxiliary KB reasoning task for learning meta KB knowledge, an auxiliary dialogue reasoning task for learning dialogue patterns, and a TDS task (primary task) that aims at not only retrieving accurate entities from KB but also generating natural responses, which are coordinated to achieve collective success in both retrieving accurate KB entities and generating human-like resp…
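The abstract describes one primary task and two auxiliary reasoning tasks that are trained to help each other. Below is a minimal sketch of the simplest reading of that setup, assuming a shared encoder whose representation feeds a response-generation head plus two auxiliary heads whose losses are added to the primary loss; the module names, toy shapes, and fixed loss weights are illustrative assumptions and stand in for the paper's actual meta cooperative learning procedure.

```python
# Sketch: joint training with one primary and two auxiliary losses.
# Illustrative assumptions only; not the paper's actual MCL architecture or schedule.
import torch
import torch.nn as nn

class TinyTDS(nn.Module):
    def __init__(self, dim=32, vocab=100):
        super().__init__()
        self.encoder = nn.Linear(dim, dim)          # shared dialogue encoder (stand-in)
        self.response_head = nn.Linear(dim, vocab)  # primary task: response generation
        self.kb_head = nn.Linear(dim, 10)           # auxiliary task: KB entity reasoning
        self.dialogue_head = nn.Linear(dim, 2)      # auxiliary task: dialogue pattern reasoning

    def forward(self, x):
        h = torch.relu(self.encoder(x))
        return self.response_head(h), self.kb_head(h), self.dialogue_head(h)

model = TinyTDS()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

# Dummy batch: encoded dialogue contexts plus a label for each task.
x = torch.randn(8, 32)
y_resp = torch.randint(0, 100, (8,))
y_kb = torch.randint(0, 10, (8,))
y_dlg = torch.randint(0, 2, (8,))

resp_logits, kb_logits, dlg_logits = model(x)
# Auxiliary losses are down-weighted and added to the primary loss (weights are assumed).
loss = ce(resp_logits, y_resp) + 0.5 * ce(kb_logits, y_kb) + 0.5 * ce(dlg_logits, y_dlg)
opt.zero_grad()
loss.backward()
opt.step()
```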

Cited by 7 publications (4 citation statements) · References 21 publications
“…Specifically, Eric and Manning (2017) employed a key-value retrieval mechanism to retrieve KB knowledge triplets. Other works treat the KB and dialogue history equally as triplet memories (Madotto et al., 2018; Wu et al., 2019; Chen et al., 2019b; He et al., 2020a; Qin et al., 2021a). Memory networks (Sukhbaatar et al., 2015) have been applied to model the dependency between related entity triplets in the KB (Bordes et al., 2017; Wang et al., 2020) and to improve domain scalability (Qin et al., 2020b; Ma et al., 2021).…”
Section: Triplet Representation
confidence: 99%
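The statement above describes storing KB triplets (and dialogue history) as memories and attending over them, in the spirit of the key-value retrieval and memory-network approaches it cites. The snippet below is a minimal single-hop illustration with made-up toy triplets and an assumed embedding size; it is not the architecture of any specific cited work.

```python
# Sketch: single-hop attention over KB triplets stored as a memory.
import torch
import torch.nn.functional as F

# Toy KB triplets (subject, relation, object) -- illustrative only.
triplets = [("poseidon", "cuisine", "greek"),
            ("poseidon", "area", "south"),
            ("pizza_hut", "cuisine", "italian")]

vocab = {tok: i for i, tok in enumerate(sorted({t for tr in triplets for t in tr}))}
emb = torch.nn.Embedding(len(vocab), 16)

# Each triplet memory is the sum of its subject, relation, and object embeddings.
memory = torch.stack([emb(torch.tensor([vocab[s], vocab[r], vocab[o]])).sum(0)
                      for s, r, o in triplets])            # (num_triplets, 16)

query = memory.mean(0)                                     # stand-in for an encoded dialogue state
attn = F.softmax(memory @ query, dim=0)                    # attention weights over the memory
kb_readout = attn @ memory                                 # weighted KB summary fed to the decoder
print(attn)
```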
“…[9] presented a large number of prior methods and datasets for implementing dialogue systems. Motivated by the significant achievements of sequence-to-sequence models in conditional text generation, numerous works have been introduced to produce responses with the seq2seq framework [4][5][6]. These models can be learnt and optimised in an end-to-end way and have good flexibility in generating novel responses.…”
Section: Open-domain Conversation Systems
confidence: 99%
“…With the availability of large-scale conversation data online, much attention has been given to fully data-driven neural conversational systems. The sequence-to-sequence model is a typical dialogue generation approach [4][5][6]: a recurrent neural network (RNN) encoder encodes the conversation history into a representation, and another RNN decoder then generates the emotional response from that representation word by word. Conversation systems built on sequence-to-sequence techniques have established state-of-the-art performance in response generation, since they are learnt in an end-to-end way, scale to large corpora, and can generate responses with good quality and flexibility.…”
Section: Introduction
confidence: 99%
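The paragraph above describes the RNN encoder-decoder pattern: encode the conversation history into a representation, then decode the response word by word. Below is a minimal greedy-decoding sketch; the sizes, the GRU choice, and the start-token convention are illustrative assumptions rather than a particular cited system.

```python
# Sketch: RNN encoder-decoder (seq2seq) response generation with greedy decoding.
import torch
import torch.nn as nn

vocab_size, hidden = 50, 32
embed = nn.Embedding(vocab_size, hidden)
encoder = nn.GRU(hidden, hidden, batch_first=True)
decoder = nn.GRU(hidden, hidden, batch_first=True)
out = nn.Linear(hidden, vocab_size)

history = torch.randint(0, vocab_size, (1, 10))        # token ids of the conversation history
_, state = encoder(embed(history))                     # final encoder state summarises the history

token = torch.zeros(1, 1, dtype=torch.long)            # assume id 0 is a start-of-sequence token
response = []
for _ in range(5):                                     # decode the response word by word (greedy)
    step, state = decoder(embed(token), state)
    token = out(step).argmax(-1)
    response.append(token.item())
print(response)
```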
“…Task-oriented dialogue systems (ToDs) aim to assist users in accomplishing various tasks, such as hotel and restaurant reservations (Luong, Pham, and Manning 2015). Since the system response is guided not only by the dialogue history but also by the knowledge base query results, the ability to query the external knowledge base is essential in EToDs (He et al. 2020a; Qin et al. 2021; He, Wang, and Chen 2020). Figure 1 illustrates such an example where the user asks for information about a Chinese restaurant.…”
Section: Introduction
confidence: 99%
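The example above hinges on querying an external knowledge base with slot-value constraints extracted from the dialogue (here, a Chinese restaurant). A tiny illustration of that lookup follows; the KB rows and slot names are made up for the sketch.

```python
# Sketch: querying an external KB with slot-value constraints from the dialogue.
kb = [
    {"name": "golden_wok",   "cuisine": "chinese", "area": "centre", "price": "cheap"},
    {"name": "peking_house", "cuisine": "chinese", "area": "north",  "price": "moderate"},
    {"name": "la_tasca",     "cuisine": "spanish", "area": "centre", "price": "expensive"},
]

def query(kb, **constraints):
    """Return KB rows matching every slot-value constraint."""
    return [row for row in kb
            if all(row.get(slot) == value for slot, value in constraints.items())]

# User: "I'm looking for a Chinese restaurant" -> constraint {cuisine: chinese}
results = query(kb, cuisine="chinese")
print([row["name"] for row in results])   # candidate entities for the system response
```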