Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.12
Zero-Shot Transfer Learning with Synthesized Data for Multi-Domain Dialogue State Tracking

Abstract: Zero-shot transfer learning for multi-domain dialogue state tracking can allow us to handle new domains without incurring the high cost of data acquisition. This paper proposes a new zero-shot transfer learning technique for dialogue state tracking in which the in-domain training data are all synthesized from an abstract dialogue model and the ontology of the domain. We show that data augmentation through synthesized data can improve the accuracy of zero-shot learning for both the TRADE model and the BERT-based SU…

Cited by 64 publications (62 citation statements)
References 22 publications
“…Kumar et al. (2020) and Li et al. (2021) follow the same experimental setup, and we compare NeuralWOZ in the same setup. We also adopt the Abstract Transaction Dialogue Model (ATDM) (Campagna et al., 2020), another method for synthesizing dialogue data, as a baseline for zero-shot domain transfer. It uses rules, abstract state transitions, and templates to synthesize dialogues, which are then fed into a model-based zero-shot learner.…”
Section: Zero-shot Domain Transfer
confidence: 99%
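The synthesis recipe attributed to ATDM above — domain-independent transition rules over abstract dialogue acts, plus a domain ontology and surface templates — can be sketched roughly as follows. This is a minimal illustration only: the transition graph, toy restaurant ontology, and template strings here are invented for demonstration and are not ATDM's actual rules.

```python
import random

# Abstract dialogue model: domain-independent transitions between dialogue acts.
TRANSITIONS = {
    "greet": ["inform"],
    "inform": ["request", "confirm"],
    "request": ["inform"],
    "confirm": ["close"],
}

# Domain ontology: slots and candidate values (a toy restaurant domain).
ONTOLOGY = {
    "food": ["italian", "thai"],
    "area": ["north", "centre"],
}

# Templates realizing each abstract act as a surface utterance.
TEMPLATES = {
    "greet": "Hello, I need a restaurant.",
    "inform": "I want {slot} to be {value}.",
    "request": "Which {slot} do you prefer?",
    "confirm": "So a {value} place, correct?",
    "close": "Great, thank you!",
}

def synthesize_dialogue(max_turns=6, seed=0):
    """Walk the abstract transition graph, filling templates from the
    ontology, and record the dialogue state alongside each turn."""
    rng = random.Random(seed)
    act, state, turns = "greet", {}, []
    for _ in range(max_turns):
        slot = rng.choice(list(ONTOLOGY))
        value = rng.choice(ONTOLOGY[slot])
        if act == "inform":
            state[slot] = value  # informing a slot updates the tracked state
        utterance = TEMPLATES[act].format(slot=slot, value=value)
        turns.append((utterance, dict(state)))
        if act == "close":
            break
        act = rng.choice(TRANSITIONS[act])
    return turns

dialogue = synthesize_dialogue()
```

Each synthesized turn pairs an utterance with the dialogue state at that point, which is exactly the (utterance, state) supervision a state tracker such as TRADE can be trained on.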
“…Other studies on dialogue datasets use user simulator-based data collection approaches (Schatzmann et al., 2007; Li et al., 2017; Bordes et al., 2017; Shah et al., 2018; Zhao and Eskenazi, 2018; Campagna et al., 2020). They define a domain schema, rules, and dialogue templates to simulate user behavior under certain goals.…”
Section: Synthetic Dialogue Generation
confidence: 99%
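A goal-driven user simulator of the kind described above — in the spirit of the agenda-based simulators of Schatzmann et al. (2007) — can be sketched as below. The goal schema and template wording are hypothetical, chosen only to show how a goal plus simple rules yields user turns.

```python
class UserSimulator:
    """Toy agenda-style user simulator: the goal is a set of slot-value
    constraints, and the agenda is the list of slots not yet communicated."""

    def __init__(self, goal):
        self.goal = dict(goal)
        self.pending = list(goal)  # agenda of slots still to mention

    def next_utterance(self):
        """Pop one constraint off the agenda and realize it with a template."""
        if not self.pending:
            return "That is all, thanks."
        slot = self.pending.pop(0)
        return f"I would like the {slot} to be {self.goal[slot]}."

goal = {"cuisine": "thai", "price": "cheap"}
sim = UserSimulator(goal)
turns = [sim.next_utterance() for _ in range(3)]
```

In a full pipeline, each simulated user turn would be answered by a system policy, and the resulting trajectories collected as training dialogues.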
“…Dialogue diversity can be improved by enforcing diversity objective functions (such as maximizing mutual information) in neural models (Li et al., 2016a; Baheti et al., 2018), perturbing language rules (Niu and Bansal, 2019) or environment parameters (Ruiz et al., 2019), randomizing trajectory synthesis (Andrychowicz et al., 2017; Lu et al., 2019), selecting more diverse data contributors (Stasaski et al., 2020), and sampling trajectories from a diverse set of environments (Chua et al., 2018; Janner et al., 2019). For instance, Campagna et al. augmented dialogue data using domain-independent transition rules and a domain-specific ontology (Campagna et al., 2020). Niu and Bansal synthesized more diverse dialogue trajectories by choosing semantic-preserving language perturbations via RL (Niu and Bansal, 2019).…”
Section: Diversification in Dialogues
confidence: 99%
“…State-of-the-art methods, based on sequence-to-sequence neural networks, require large amounts of manually annotated data (Dong and Lapata, 2016; Jia and Liang, 2016). Various methods have been proposed to eliminate manually annotated data for new domains, using synthesis (Wang et al., 2015; Shah et al., 2018; Campagna et al., 2019; Xu et al., 2020a,b), transfer learning (Zhong et al., 2017; Herzig and Berant, 2018; Yu et al., 2018), or a combination of both (Rastogi et al., 2019; Campagna et al., 2020). All these works focus mainly on English and have not been applied to other languages.…”
Section: Related Work
confidence: 99%