2023
DOI: 10.1609/aaai.v37i11.26603
Uncertainty-Aware Self-Training for Low-Resource Neural Sequence Labeling

Abstract: Neural sequence labeling (NSL) aims to assign labels to input language tokens, covering a broad range of applications such as named entity recognition (NER) and slot filling. However, the strong results achieved by traditional supervised approaches depend heavily on large amounts of human-annotated data, which may not be feasible in real-world scenarios due to data privacy and computation efficiency issues. This paper presents SeqUST, a novel uncertainty-aware self-training framework…
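The abstract is truncated, but the general idea behind uncertainty-aware self-training can be illustrated: run several stochastic forward passes (e.g. with dropout enabled) over unlabeled sentences, measure per-token predictive uncertainty, and keep only confident tokens as pseudo-labels. The sketch below is a generic illustration of that idea, not the paper's exact method; the entropy threshold and the simulated predictions are assumptions for the example.

```python
import numpy as np

def token_uncertainty(mc_probs):
    """mc_probs: (n_passes, seq_len, n_labels) softmax outputs from
    several stochastic (dropout-enabled) forward passes."""
    mean_probs = mc_probs.mean(axis=0)  # (seq_len, n_labels)
    # Predictive entropy of the averaged distribution, per token
    entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=-1)
    return mean_probs.argmax(axis=-1), entropy

def select_pseudo_labels(mc_probs, threshold):
    """Keep pseudo-labels only for tokens whose uncertainty is low."""
    labels, entropy = token_uncertainty(mc_probs)
    keep = entropy < threshold  # boolean mask over tokens
    return labels, keep

# Simulated example: 8 stochastic passes over a 5-token sentence, 3 labels
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 5, 3))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
labels, keep = select_pseudo_labels(probs, threshold=1.0)
```

In a real self-training loop, the kept (token, label) pairs would be added to the training set and the tagger retrained; SeqUST's actual selection and training criteria are described in the paper itself.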

Cited by 1 publication (1 citation statement)
References 40 publications
“…Zhou et al. [23] proposed a two-stage learning pipeline to tackle the oncological NER task in Chinese, a typical task lacking training resources. Wang et al. [24] proposed SeqUST, a novel uncertainty-aware self-training framework for NSL that addresses labeled-data scarcity and effectively utilizes unlabeled data. Chen et al. [25] proposed a CP-NER model to solve the problem of cross-domain resource scarcity in practical scenarios.…”
Section: Introduction
confidence: 99%