Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.567

SAS: Dialogue State Tracking via Slot Attention and Slot Information Sharing

Abstract: A dialogue state tracker is responsible for inferring user intentions from the dialogue history. Previous methods have difficulty handling dialogues with long interaction contexts because of the excessive information they contain. We propose a Dialogue State Tracker with Slot Attention and Slot Information Sharing (SAS) to reduce interference from redundant information and to improve tracking over long dialogue contexts. Specifically, we first apply a Slot Attention to learn a set of slot-specific features from the original dialogue and th…
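A minimal sketch of the slot-attention idea the abstract describes: each slot attends over the encoded dialogue history to extract a slot-specific feature vector, filtering out tokens irrelevant to that slot. Function names, dimensions, and the scaled dot-product scoring are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def slot_attention(dialogue_hidden, slot_queries):
    """dialogue_hidden: (seq_len, d) encoded dialogue history
       slot_queries:    (num_slots, d) one learned query vector per slot
       returns:         (num_slots, d) slot-specific summaries"""
    scores = slot_queries @ dialogue_hidden.T                      # (num_slots, seq_len)
    weights = F.softmax(scores / dialogue_hidden.shape[-1] ** 0.5, dim=-1)
    return weights @ dialogue_hidden                               # (num_slots, d)

# toy usage: 120 dialogue tokens, hidden size 256, 30 (domain-)slot queries
h = torch.randn(120, 256)
q = torch.randn(30, 256)
slot_feats = slot_attention(h, q)   # (30, 256)
```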

Cited by 38 publications (45 citation statements) | References 19 publications
“…With the recent development of deep learning and representation learning, most work on DST has focused on encoding the dialogue context with deep neural networks and predicting a value for each possible slot (Xu and Hu, 2018; Zhong et al., 2018). For multi-domain DST, slot-value pairs are extended to domain-slot-value pairs (Wu et al., 2019; Chen et al., 2020b; Hu et al., 2020; Heck et al., 2020). These models greatly improve the performance of DST, but the mechanism of treating all slots equally is inefficient and may lead to additional errors.…”
Section: Related Work
confidence: 99%
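The quoted formulation, "predicting a value for each possible slot," is commonly realized as one classification head per (domain-)slot pair over that slot's candidate values. A hedged sketch under that assumption; the class name and toy ontology are hypothetical and do not correspond to any single cited model:

```python
import torch
import torch.nn as nn

class PerSlotValueClassifier(nn.Module):
    def __init__(self, hidden_size, slot_values):
        super().__init__()
        # one scoring head per (domain-)slot over its candidate value list
        self.heads = nn.ModuleDict({
            slot: nn.Linear(hidden_size, len(values))
            for slot, values in slot_values.items()
        })

    def forward(self, dialogue_vec):
        # dialogue_vec: (batch, hidden_size) pooled dialogue encoding
        return {slot: head(dialogue_vec) for slot, head in self.heads.items()}

# toy usage with two domain-slot pairs (hypothetical ontology)
ontology = {"hotel-area": ["north", "south", "none"],
            "train-day": ["monday", "tuesday", "none"]}
model = PerSlotValueClassifier(256, ontology)
logits = model(torch.randn(4, 256))   # {"hotel-area": (4, 3), "train-day": (4, 3)}
```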
“…DS-DST uses two BERT-base encoders and takes a hybrid approach. SAS proposes a Dialogue State Tracker with Slot Attention and Slot Information Sharing to reduce interference from redundant information (Hu et al., 2020). SOM-DST treats the dialogue state as an explicit fixed-size memory and proposes a selective overwriting mechanism (Kim et al., 2020).…”
Section: Baseline Models
confidence: 99%
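A sketch of SOM-DST's selective overwriting idea as described above: the dialogue state is a fixed-size memory (one value per slot), and each turn a per-slot operation decides whether to carry the value over or overwrite it. The four operation names follow the SOM-DST paper; the update function itself is a simplified illustration, not the paper's code.

```python
CARRYOVER, DELETE, DONTCARE, UPDATE = "carryover", "delete", "dontcare", "update"

def selectively_overwrite(prev_state, ops, new_values):
    """prev_state/new_values: {slot: value}; ops: {slot: operation}.
    Returns the next dialogue state, touching only the slots marked for change."""
    state = dict(prev_state)
    for slot, op in ops.items():
        if op == UPDATE:
            state[slot] = new_values[slot]   # overwrite only this slot
        elif op == DELETE:
            state[slot] = "none"             # slot no longer active
        elif op == DONTCARE:
            state[slot] = "dontcare"
        # CARRYOVER: keep the previous value untouched
    return state
```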
“…At each step, the new data (D_i) can contain one or multiple new domains. In addition, inspired by other lifelong learning work (Lopez-Paz and Ranzato, 2017; Zenke et al., 2017), we treat dialogues that span the same multiple domains as data of a special domain, since these cross-domain dialogues usually contain specific expressions that distinguish them from other dialogues, such as domain transformation and slot reference (Ouyang et al., 2020; Hu et al., 2020). The updated model should still perform well on all previous domains.…”
Section: Task Formulation
confidence: 99%
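The protocol this quote formulates can be summarized as a train-then-evaluate loop over domain batches D_1..D_T, checking after every update that earlier domains are not forgotten. A minimal sketch; the triple layout and the train_step/evaluate callables are placeholder assumptions:

```python
def continual_dst_protocol(model, domain_steps, train_step, evaluate):
    """domain_steps: sequence of (name, train_data, test_data) triples, one per
    update step D_1..D_T (a single step may cover several new domains)."""
    seen, history = [], []
    for name, train_data, test_data in domain_steps:
        train_step(model, train_data)                 # learn the new domain(s)
        seen.append((name, test_data))
        # the updated model should still perform well on all previous domains
        history.append({n: evaluate(model, t) for n, t in seen})
    return history
```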
“…It involves 7 domains and 18 slots, which form 35 domain-slot pairs. We conducted evaluations on the latest MultiWOZ 2.1 dataset [15]. Results table (flattened in the snippet; the columns read as joint goal accuracy on MultiWOZ 2.0 / 2.1, and the first model's name was lost in extraction):

Model             MultiWOZ 2.0   MultiWOZ 2.1
(name lost)       35.57          -
DST Reader [17]   39.41          36.4
COMER [9]         45.72          -
TRADE [8]         48.62          45.6
NADST [18]        50.52          49.0
SAS [19]          51.03          -
DST-SC [20]       52.24          49.6
PRO-DST (Ours)    51.48          49.9…”
Section: Datasets
confidence: 99%
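Assuming the table numbers are joint goal accuracy (the standard MultiWOZ metric), a turn counts as correct only if every (domain-)slot value in the predicted state matches the gold state. A minimal reference implementation of that metric:

```python
def joint_goal_accuracy(predicted_states, gold_states):
    """Each state is a {domain-slot: value} dict for one dialogue turn; a turn
    is correct only when the whole predicted state equals the gold state."""
    correct = sum(pred == gold for pred, gold in zip(predicted_states, gold_states))
    return correct / len(gold_states)
```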