Findings of the Association for Computational Linguistics: ACL 2023
DOI: 10.18653/v1/2023.findings-acl.524

Towards Distribution-shift Robust Text Classification of Emotional Content

Abstract: Supervised models based on Transformers have been shown to achieve impressive performance in many natural language processing tasks. However, besides requiring large amounts of costly manually annotated data, supervised models tend to adapt to the characteristics of the training dataset, which is usually created ad hoc and whose data distribution often differs from that of real applications, leading to significant performance degradation in real-world scenarios. We perform an extensive assessment of the out…

Cited by 2 publications (1 citation statement)
References 30 publications
“…We also experimented with several state-of-the-art text-to-text transformers, which treat all tasks as text generation problems. These transformers have provided excellent results in text classification tasks (Bulla et al., 2023; Sabry et al., 2022; Ni et al., 2022). They do not rely on a classification layer (Raffel et al., 2020) and have a flexible input-output format.…”
Section: Text-to-text Transformers
Confidence: 99%
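The text-to-text framing described in the citation statement can be made concrete with a minimal sketch. The idea, following T5-style models (Raffel et al., 2020), is that a classification example becomes an (input text, target text) pair: the label is generated as a string rather than predicted by a dedicated classification head. The function names, the task prefix, and the emotion labels below are illustrative assumptions, not from the cited papers:

```python
# Illustrative sketch of the text-to-text framing of classification
# (T5-style): the example is cast as (source text, target text), so the
# model just generates the label string and needs no classification layer.

def to_text_to_text(task_prefix: str, text: str, label: str):
    """Cast a labeled classification example into a text-to-text pair.
    `task_prefix` is a hypothetical prompt such as "classify emotion"."""
    source = f"{task_prefix}: {text}"
    target = label  # the label is emitted as plain text, not a class index
    return source, target

def decode_label(generated: str, label_set: set, default: str = "unknown"):
    """Map a generated string back to a known label; free-form generation
    may drift outside the label set, hence the fallback."""
    generated = generated.strip().lower()
    return generated if generated in label_set else default

# Example: an emotion-classification instance in text-to-text form.
src, tgt = to_text_to_text("classify emotion", "I can't stop smiling today!", "joy")
```

The flexible input-output format the statement mentions falls out of this framing: changing the task only means changing the prefix and the target strings, while the model architecture stays the same.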