Proceedings of the 2nd Workshop on Natural Language Processing for Conversational AI 2020
DOI: 10.18653/v1/2020.nlp4convai-1.5
Efficient Intent Detection with Dual Sentence Encoders

Abstract: Building conversational systems in new domains and with added functionality requires resource-efficient models that work in low-data regimes (i.e., in few-shot setups). Motivated by these requirements, we introduce intent detection methods backed by pretrained dual sentence encoders such as USE and ConveRT. We demonstrate the usefulness and wide applicability of the proposed intent detectors, showing that: 1) they outperform intent detectors based on fine-tuning the full BERT-Large model or using BERT as a …

Cited by 207 publications (186 citation statements) | References 27 publications
“…Note that, besides quicker pretraining, intent classifiers based on ConveRT encodings train 40 times faster than BERT-Large-based ones, as only the classification layers are trained for ConveRT. Additional experiments on the efficiency of intent classification have been conducted by Casanueva et al. (2020). In sum, these preliminary results suggest that ConveRT as a sentence encoder can be useful beyond the core response selection task.…”
Section: Banking
confidence: 86%
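The speedup described above comes from keeping the pretrained sentence encoder frozen and training only a lightweight classification head on its fixed encodings. A minimal sketch of that setup, using randomly generated vectors as stand-ins for the encoder's output (no real ConveRT or USE encoder is loaded here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-ins for fixed sentence encodings (e.g., 512-dim vectors from a frozen
# dual encoder); in practice these would be precomputed once per utterance.
n_per_class, dim, n_classes = 20, 512, 3
centers = rng.normal(size=(n_classes, dim))
X = np.vstack([centers[c] + 0.1 * rng.normal(size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Only this classification head is trained; the encoder stays frozen, which is
# why training is far cheaper than fine-tuning a large model end to end.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.score(X, y))
```

Because only the linear head's parameters are updated, training reduces to a small convex optimization over precomputed features.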
“…In sum, these preliminary results suggest that ConveRT as a sentence encoder can be useful beyond the core response selection task. The usefulness of ConveRT-based sentence representations has recently been confirmed on other intent classification datasets (Casanueva et al., 2020), with different intent classifiers (Bunk et al., 2020), and in another dialog task: turn-based value extraction (Coope et al., 2020; Bunk et al., 2020; Mehri et al., 2020). In future work, we plan to investigate other possible applications of transfer, especially for the challenging low-data setups.…”
Section: Banking
confidence: 87%
“…Notice that the intent classifier is typically implemented using standard text classification algorithms (Weiss et al., 2012; Larson et al., 2019; Casanueva et al., 2020). Consequently, to perform OOS sample detection, methods often rely on one-class classification or threshold rejection-based techniques using the probability outputs for each class (Larson et al., 2019) or reconstruction errors (Ryu et al., 2017, 2018).…”
Section: Introduction
confidence: 99%
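The threshold rejection technique mentioned above can be sketched in a few lines: the classifier's top probability is compared against a cutoff, and inputs below it are flagged as out-of-scope. The function name and the threshold value here are illustrative, not from any cited implementation:

```python
import numpy as np

def predict_with_oos(probs: np.ndarray, threshold: float = 0.7) -> int:
    """Return the argmax intent index, or -1 (out-of-scope) when the
    classifier's top probability falls below `threshold`."""
    top = int(np.argmax(probs))
    return top if probs[top] >= threshold else -1

# A confident in-scope prediction vs. an uncertain input rejected as OOS.
print(predict_with_oos(np.array([0.05, 0.90, 0.05])))  # 1
print(predict_with_oos(np.array([0.40, 0.35, 0.25])))  # -1
```

Choosing the threshold trades off OOS recall against in-scope accuracy, which is why the cited work also considers learned alternatives such as reconstruction-error scores.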