Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) 2020
DOI: 10.18653/v1/2020.emnlp-main.27
Augmented Natural Language for Generative Sequence Labeling

Abstract: We propose a generative framework for joint sequence labeling and sentence-level classification. Our model performs multiple sequence labeling tasks at once using a single, shared natural language output space. Unlike prior discriminative methods, our model naturally incorporates label semantics and shares knowledge across tasks. Our framework is general purpose, performing well on few-shot, low-resource, and high-resource tasks. We demonstrate these advantages on popular named entity recognition, slot labeling…
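The abstract describes casting sequence labeling as text generation: instead of predicting a tag per token, the model emits a single natural language string in which labels appear inline next to the spans they describe. The excerpt does not show the paper's exact output format, so the sketch below uses a hypothetical bracketed delimiter purely to illustrate the idea of a shared natural language output space that can be serialized from, and parsed back into, BIO-style annotations.

```python
# Hypothetical sketch of an "augmented natural language" target for
# generative sequence labeling. The "[ span | label ]" delimiter format
# is an assumption for illustration, not the paper's actual format.
import re


def to_augmented(tokens, tags):
    """Serialize (token, BIO-tag) pairs into one augmented output string."""
    out, i = [], 0
    while i < len(tokens):
        if tags[i].startswith("B-"):
            label = tags[i][2:]
            span = [tokens[i]]
            i += 1
            # Absorb continuation tokens of the same entity.
            while i < len(tokens) and tags[i] == f"I-{label}":
                span.append(tokens[i])
                i += 1
            out.append(f"[ {' '.join(span)} | {label} ]")
        else:  # "O" tokens pass through as plain text.
            out.append(tokens[i])
            i += 1
    return " ".join(out)


def from_augmented(text):
    """Recover (span, label) pairs from the augmented string."""
    return re.findall(r"\[ (.+?) \| (\w+) \]", text)
```

Because the target is ordinary text, one sequence-to-sequence model can share this output space across tasks (NER, slot labeling, classification) and exploit the semantics of the label words themselves.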

Cited by 36 publications (29 citation statements). References 25 publications.
“…The main idea is to simulate fast update scenarios during training, and update the model's parameters so that the model performs fast updates more efficiently. Fast update with meta learning has been applied to NLP models for generalizing to unseen tasks or domains (Gu et al., 2018a; Dou et al., 2019; Bansal et al., 2020; Athiwaratkun et al., 2020; Wang et al., 2021).…”
Section: Metric Learning
Citation type: mentioning (confidence: 99%)
“…Inspired by recent success in formulating various NLP tasks as text generation problems (Athiwaratkun et al., 2020; Paolini et al., 2021), we propose to tackle ASQP in a sequence-to-sequence (S2S) manner in this paper. On one hand, the sentiment quads can be predicted in an end-to-end manner, alleviating the potential error propagation in the pipeline solutions.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…Motivated by recent success in formulating several language understanding problems such as named entity recognition, question answering, and text classification as generation tasks (Raffel et al., 2020; Athiwaratkun et al., 2020), we propose to tackle various ABSA problems in a unified generative approach in this paper. It can fully utilize the rich label semantics by encoding the natural language label into the target output.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)