Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP) 2022
DOI: 10.18653/v1/2022.wanlp-1.43
Domain-Adapted BERT-based Models for Nuanced Arabic Dialect Identification and Tweet Sentiment Analysis

Cited by 4 publications (2 citation statements)
References 0 publications
“…Various models were combined, and performance was improved using a combination of TF-IDF and n-grams. Ensembles of transformer-based models, predominantly variations of MARBERT, were employed for both dialect detection and sentiment analysis in the NADI 2022 shared tasks (Bayrak and Issifu, 2022; Khered et al., 2022). Oumar and Mrini (2022) addressed class imbalance in the NADI dataset by using focal loss and employed various Arabic BERT-based models.…”
Section: Related Work (mentioning)
confidence: 99%
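The excerpt above names focal loss as the fix for class imbalance in NADI, but the exact configuration used by Oumar and Mrini (2022) is not reproduced on this page. The following is a minimal PyTorch sketch of multi-class focal loss; the gamma value of 2.0 and the optional per-class alpha weights are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=None):
    # Multi-class focal loss: scales cross-entropy by (1 - p_t)^gamma so
    # easy, well-classified examples contribute less, pushing training
    # toward rare classes (e.g., under-represented dialect labels).
    log_probs = F.log_softmax(logits, dim=-1)   # (batch, num_classes)
    probs = log_probs.exp()
    # Log-probability and probability assigned to the true class.
    target_log_probs = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    target_probs = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    loss = -((1.0 - target_probs) ** gamma) * target_log_probs
    if alpha is not None:
        # Hypothetical per-class weight vector, indexed by the true label.
        loss = alpha[targets] * loss
    return loss.mean()
```

With gamma = 0 and alpha = None this reduces to standard cross-entropy, which is why it drops in directly as a replacement loss when fine-tuning BERT-based classifiers.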
“…Abdel-Salam (2022) used an ensemble of fine-tuned BERT-based models together with several parameter-efficient tuning approaches, including p-tuning and prompt-tuning. Bayrak and Issifu (2022) used general pre-training as a first step, followed by fine-tuning. AlKhamissi et al. (2021) added an adapter layer on top of the MARBERT model.…”
Section: Introduction (mentioning)
confidence: 99%
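AlKhamissi et al. (2021) are described only as adding an adapter layer on top of MARBERT; the architectural details are not given on this page. Below is a minimal sketch of one common reading: a bottleneck adapter over the [CLS] representation of a frozen MARBERT encoder. The Hugging Face checkpoint name UBC-NLP/MARBERT is real, but the bottleneck width, the freezing choice, and the placement after pooling are illustrative assumptions.

```python
import torch.nn as nn
from transformers import AutoModel

class Adapter(nn.Module):
    # Bottleneck adapter: down-project, nonlinearity, up-project, residual.
    def __init__(self, hidden_size, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

class MarbertWithAdapter(nn.Module):
    def __init__(self, num_labels, model_name="UBC-NLP/MARBERT"):
        super().__init__()
        self.backbone = AutoModel.from_pretrained(model_name)
        for p in self.backbone.parameters():
            p.requires_grad = False          # train only adapter + classifier
        hidden = self.backbone.config.hidden_size
        self.adapter = Adapter(hidden)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.backbone(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]    # [CLS] token representation
        return self.classifier(self.adapter(cls))
```

Because only the adapter and classifier receive gradients, this keeps per-task trainable parameters small relative to full fine-tuning, which is the usual motivation for adapter layers.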