2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU)
DOI: 10.1109/asru46091.2019.9003911
Scalable Neural Dialogue State Tracking

Abstract: A Dialogue State Tracker (DST) is a key component in a dialogue system, aiming at estimating the beliefs of possible user goals at each dialogue turn. Most current DST trackers make use of recurrent neural networks and are based on complex architectures that manage several aspects of a dialogue, including the user utterance, the system actions, and the slot-value pairs defined in a domain ontology. However, the complexity of such neural architectures incurs considerable latency in the dialogue sta…

Cited by 8 publications (5 citation statements) | References 17 publications
“…DST - In recent years, neural network approaches have defined the state of the art in DST research. Most previous works decomposed this task as a series of classification problems (Zhong et al., 2018; Mrkšić et al., 2016; Balaraman and Magnini, 2019; Lee et al., 2019). They took each of the slot-value pairs as input for scoring, and output the value with the highest score as the predicted value for a slot.…”
Section: Related Work
confidence: 99%
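The classification formulation described above (score every candidate value for a slot, predict the argmax) can be sketched without any ML library. This is a minimal hypothetical illustration: the embeddings and the dot-product scorer stand in for the neural encoders the cited models actually use.

```python
# Minimal sketch (hypothetical) of classification-style DST:
# every candidate slot value is scored against the dialogue context,
# and the highest-scoring value becomes the prediction for that slot.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def predict_slot_value(context_vec, value_vecs):
    """Return (best_value, score); value_vecs maps value name -> embedding."""
    scores = {val: dot(context_vec, vec) for val, vec in value_vecs.items()}
    return max(scores.items(), key=lambda kv: kv[1])

# Toy embeddings: in a real tracker these come from learned encoders.
context = [0.9, 0.1, 0.0]                      # encodes e.g. "a cheap restaurant"
price_values = {
    "cheap":     [1.0, 0.0, 0.0],
    "moderate":  [0.0, 1.0, 0.0],
    "expensive": [0.0, 0.0, 1.0],
}
best, score = predict_slot_value(context, price_values)
# → best == "cheap"
```

Note that this formulation requires enumerating and scoring every value in the ontology per slot, which is exactly the latency concern the abstract raises.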
“…To address these limitations, some recent works discarded the classification framework (Xu and Hu, 2018; Wu et al., 2019; Balaraman and Magnini, 2019). (Xu and Hu, 2018) is the first model that applies pointer networks (Vinyals et al., 2015) to the single-domain DST problem, generating both start and end pointers to perform index-based copying.…”
Section: Related Work
confidence: 99%
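The index-based copying described in this excerpt can be illustrated with a small sketch. This is a hypothetical toy: the score arrays stand in for the pointer network's start- and end-index distributions over the utterance; the span search itself is the standard argmax over valid (start, end) pairs.

```python
# Minimal sketch (hypothetical) of pointer-style index-based copying:
# the model predicts a start index and an end index over the user
# utterance, and the token span between them is copied as the slot value.

def best_span(start_scores, end_scores, max_len=5):
    """Pick (start, end) maximizing start_scores[s] + end_scores[e], s <= e."""
    best = (0, 0, float("-inf"))
    for s, s_score in enumerate(start_scores):
        for e in range(s, min(s + max_len, len(end_scores))):
            if s_score + end_scores[e] > best[2]:
                best = (s, e, s_score + end_scores[e])
    return best[0], best[1]

tokens = ["book", "a", "table", "at", "the", "golden", "dragon"]
# Toy scores standing in for the pointer network's output distributions.
start = [0.0, 0.0, 0.0, 0.0, 0.1, 2.0, 0.3]
end   = [0.0, 0.0, 0.0, 0.0, 0.0, 0.2, 2.5]
s, e = best_span(start, end)
value = " ".join(tokens[s:e + 1])
# → "golden dragon"
```

Because the value is copied from the utterance rather than selected from an ontology, this approach scales to values never seen at training time.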
“…
Word-based DST (Henderson et al., 2014c)         | Closed | Closed | Fixed   | Function
Multi-domain DST (Mrkšić et al., 2015)           | Closed | Closed | Fixed   | Function
FS-NBT (Mrkšić and Vulić, 2018)                  | Closed | Closed | Fixed   | Function
Scalable Multi-domain DST (Rastogi et al., 2017) | Closed | Closed | Fixed   | Rules
CNN-Delex (Wen et al., 2017)                     | Closed | Closed | Fixed   | Rules
NBT (Mrkšić et al., 2017a)                       | Closed | Closed | Fixed   | Rules
StateNet (Ren et al., 2018)                      | Closed | Open*  | Fixed   | Rules
Pointer (Xu and Hu, 2018)                        | Open   | Closed | Fixed   | Rules
GLAD (Zhong et al., 2018)                        | Closed | Closed | Fixed   | Rules
GCE (Nouri and Hosseini-Asl, 2018)               | Closed | Open   | Fixed   | Rules
GSAT (Balaraman and Magnini, 2019)               | Closed | Closed | Fixed   | Rules
BERT-DST (Chao and Lane, 2019)                   | Open   | Closed | Fixed   | Rules
TRADE                                            | Open   | Open*  | Dynamic | None
DS-Picklist (Zhang et al., 2019)                 | Closed | Open   | Fixed   | None
SUMBT                                            | Closed | Open   | Fixed   | Function
SST                                              | Closed | Open*  | Fixed   | Function
SGD-Baseline                                     | Open   | Open   | Dynamic | Rules
DA-DST (Balaraman and Magnini, 2021)             | Open   | Open   | Dynamic | Rules
SOM-DST (Kim et al., 2020)                       | Open   | Open   | Dynamic | Function
TripPy (Heck et al., 2020)                       | Open   | Open   | Dynamic | Function
MinTL                                            | Open   | Open   | Dynamic | Function
Neural Reading (Gao et al., 2019)                | Open   | Open   | Dynamic | Function
NARDST (Le et al., 2020)                         | Open   | Open   | Dynamic | None
…”
Section: Model
confidence: 99%
“…Recent neural network models have been proposed for further improvements (Mrkšić et al., 2015; Hori et al., 2016; Lei et al., 2018; Xu and Hu, 2018; Zhong et al., 2018; Nouri and Hosseini-Asl, 2018; Ren et al., 2019; Balaraman and Magnini, 2019). …use an RNN to encode the slot-related information of each turn, where slots cannot attend directly to relevant information from past turns.…”
Section: Related Work
confidence: 99%