2023
DOI: 10.3390/systems12010001

Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language

Victor Kwaku Agbesi, Wenyu Chen, Sophyani Banaamwini Yussif, et al.

Abstract: Despite a few attempts to automatically crawl Ewe text from online news portals and magazines, the African Ewe language remains underdeveloped, even though it has a rich morphology and a complex, unique structure. The crawled Ewe texts are of poor quality, unbalanced, and largely religious in content, which makes them difficult to preprocess and unsuitable for NLP tasks with current transformer-based language models. In this study, we present a well-preprocessed Ewe dataset for low-resource text classification to th…
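As a rough illustration of the setup the abstract describes (fine-tuning a pre-trained transformer for text classification on a low-resource language), the sketch below uses the Hugging Face transformers library. The checkpoint name, label set, and example texts are assumptions for illustration, not the paper's actual configuration.

```python
# A minimal sketch, not the paper's code: one fine-tuning step and one
# inference pass for sentence-level classification with a multilingual
# pre-trained transformer. Checkpoint, labels, and texts are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "xlm-roberta-base"            # assumed multilingual checkpoint
LABELS = ["politics", "sports", "religion"]  # hypothetical topic labels

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

# Toy batch standing in for preprocessed Ewe sentences.
texts = ["Ewe sentence one ...", "Ewe sentence two ..."]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
labels = torch.tensor([0, 2])

# One supervised fine-tuning step: cross-entropy loss over the label set.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()

# Inference: argmax over the class logits.
model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print([LABELS[i] for i in preds])
```

In practice a full training loop over the preprocessed dataset would replace the single step shown here; the structure (tokenize, forward pass with labels, backpropagate, predict by argmax) is the standard fine-tuning recipe the abstract's transformer-based models rely on.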

Cited by 2 publications
References 45 publications