2019
DOI: 10.1609/aaai.v33i01.33016642

Zero-Shot Adaptive Transfer for Conversational Language Understanding

Abstract: Conversational agents such as Alexa and Google Assistant constantly need to increase their language understanding capabilities by adding new domains. A massive amount of labeled data is required for training each new domain. While domain adaptation approaches alleviate the annotation cost, prior approaches suffer from increased training time and suboptimal concept alignments. To tackle this, we introduce a novel Zero-Shot Adaptive Transfer method for slot tagging that utilizes the slot description for transfer…

Cited by 55 publications (59 citation statements)
References 11 publications

“…Baselines: We compare with two strong zero-shot baselines: Zero-shot Adaptive Transfer (ZAT) (Lee and Jha, 2018) and Concept Tagger (CT) (Bapna et al., 2017). We sample positive and negative instances (Figure 3) in a ratio of 1:3. Slot values input during training and evaluation are randomly picked from values taken by the input slot in the relevant domain's training set, excluding ones that are also present in the evaluation set.…”
Section: Datasets and Experiments
confidence: 99%
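
The sampling scheme quoted above can be made concrete with a short sketch. This is a minimal illustration under assumptions, not the cited paper's code; names such as `train_slot_values` and `eval_slot_values` are hypothetical.

```python
# Hedged sketch of 1:3 positive/negative instance sampling, where candidate
# slot values come from the slot's training-set values in the relevant domain,
# excluding any value that also appears in the evaluation set.
import random

def sample_instances(utterances, slot, train_slot_values, eval_slot_values,
                     neg_ratio=3, seed=0):
    rng = random.Random(seed)
    # Candidate values: this slot's training values, minus evaluation values.
    candidates = [v for v in train_slot_values[slot]
                  if v not in eval_slot_values[slot]]
    instances = []
    for utterance, gold_value in utterances:
        # One positive instance per utterance...
        instances.append((utterance, gold_value, 1))
        # ...and up to `neg_ratio` negatives drawn from non-gold values (1:3).
        pool = [v for v in candidates if v != gold_value]
        for value in rng.sample(pool, k=min(neg_ratio, len(pool))):
            instances.append((utterance, value, 0))
    return instances
```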
“…Label semantics provide contextual signals that can improve model performance in multi-task and low-resource scenarios. Multiple works show that conditioning input representations on slot description embeddings improves multi-domain slot labeling performance (Bapna et al., 2017; Lee and Jha, 2019). Embedding example slot values in addition to slot descriptions yields further improvements in zero-shot slot labeling (Shah et al., 2019).…”
Section: Related Work
confidence: 99%
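
As a concrete illustration of conditioning input representations on slot description embeddings, here is a minimal PyTorch sketch. It is a simplification under assumptions, not the architecture of any cited paper: the module name, dimensions, and the pooled description input are all illustrative.

```python
# Minimal sketch: fuse each token state with a slot-description embedding
# before scoring IOB labels. Names and dimensions are assumptions.
import torch
import torch.nn as nn

class DescriptionConditionedTagger(nn.Module):
    def __init__(self, emb_dim=100, hidden_dim=128, num_labels=3):
        super().__init__()
        # Contextual encoder over the utterance tokens.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Classifier over [token state; slot description] for I/O/B labels.
        self.classifier = nn.Linear(2 * hidden_dim + emb_dim, num_labels)

    def forward(self, token_embs, desc_emb):
        # token_embs: (batch, seq_len, emb_dim) word embeddings
        # desc_emb:   (batch, emb_dim) pooled slot-description embedding
        states, _ = self.encoder(token_embs)
        desc = desc_emb.unsqueeze(1).expand(-1, token_embs.size(1), -1)
        # Conditioning step: concatenate the description onto every token.
        return self.classifier(torch.cat([states, desc], dim=-1))
```

Concatenation is only one way to condition; attending over the description's token embeddings is another common choice.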
“…• Concept Tagger (CT) [35] and Zero-shot Adaptive Transfer (ZAT) [36]: These two zero-shot methods also condition slot filling on textual slot descriptions, using a contextual BLSTM layer and a slot-dependent BLSTM layer. However, they predict an IOB sequence for each slot separately, which can yield overlapping segmentations and costs more time at inference.…”
Section: B. Baselines
confidence: 99%
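
To see why per-slot IOB prediction costs more time, consider a hedged decoding loop over the slot inventory, reusing the hypothetical tagger sketched above: the model runs once per candidate slot, so latency grows linearly with the slot inventory, and nothing prevents two slots from claiming overlapping spans.

```python
# Hedged sketch of CT/ZAT-style per-slot decoding: one forward pass per
# candidate slot, yielding one IOB sequence per slot. Overlaps between the
# per-slot predictions are possible and need a separate resolution step.
def tag_all_slots(model, token_embs, slot_desc_embs):
    predictions = {}
    for slot, desc_emb in slot_desc_embs.items():
        logits = model(token_embs, desc_emb)          # (1, seq_len, 3)
        predictions[slot] = logits.argmax(dim=-1)[0]  # IOB ids for this slot
    return predictions
```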