Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/603
Learning to Converse with Noisy Data: Generation with Calibration

Abstract: The availability of abundant conversational data on the Internet has fueled generation-based open-domain conversation systems. When training generation models, existing methods generally treat all training data equally. However, data crawled from websites may contain considerable noise, and blindly training on noisy data can harm the performance of the final generation model. In this paper, we propose a generation-with-calibration framework that allows high-quality data …


Cited by 32 publications (28 citation statements)
References 5 publications
“…According to the model approach, models are trained while handling noise at the same time. For example, Shang et al (2018) proposed a method with a calibration framework and demonstrated its effectiveness on a Chinese corpus. According to the data approach, training data are pre-processed with the aim of improving their quality before training models.…”
Section: Introduction
confidence: 99%
“…Many efficient approaches have been proposed for developing intelligent dialogue systems (He et al, 2017; Shang et al, 2018; Tian et al, 2019; Cai et al, 2019). For multi-turn dialogue systems, Serban et al (2016) and Xing et al (2018) adopt hierarchical neural networks to model context.…”
Section: Multi-turn Dialogue Systems
confidence: 99%
“…Our work is inspired by the work of using new learning strategies to distinguish the noise in training data [7,10,15]. Shang et al [10] and Lison et al [7] utilized instance weighting strategy in open domain dialog systems via simple methods. Wu et al [15] altered the negative sampling strategy and utilized a sequence-to-sequence model to distinguish false negative samples.…”
Section: Related Work
confidence: 99%
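The citation statements above repeatedly describe the core idea as instance weighting: each training pair is assigned a quality (calibration) weight so that noisy pairs contribute less to the training loss. As a minimal illustration of that idea only — not the paper's actual model, and with all function names and the toy numbers below being hypothetical — a per-instance weighted negative log-likelihood can be sketched as:

```python
import math

def weighted_nll(token_logprobs, weight):
    """Negative log-likelihood of one response, scaled by a calibration weight.

    token_logprobs: log-probabilities the model assigns to each response token.
    weight: quality score in [0, 1]; noisy pairs get a small weight.
    """
    return -weight * sum(token_logprobs)

def batch_loss(batch):
    """Average weighted loss over a batch of (token_logprobs, weight) pairs."""
    return sum(weighted_nll(lp, w) for lp, w in batch) / len(batch)

# Toy batch: two responses with identical model scores, but the second
# pair is judged noisier and therefore down-weighted.
batch = [
    ([math.log(0.5), math.log(0.25)], 1.0),  # clean pair, full weight
    ([math.log(0.5), math.log(0.25)], 0.2),  # noisy pair, down-weighted
]
loss = batch_loss(batch)
```

With a hard 0/1 weight this reduces to data filtering; a soft weight instead keeps noisy pairs in training while shrinking their gradient contribution, which is the behavior the "calibration" framing suggests.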