2020
DOI: 10.48550/arxiv.2006.07698
Preprint

Transferring Monolingual Model to Low-Resource Language: The Case of Tigrinya

Abrhalei Tela,
Abraham Woubie,
Ville Hautamaki

Abstract: In recent years, transformer models have achieved great success in natural language processing (NLP) tasks. Most current state-of-the-art NLP results are achieved with monolingual transformer models, where the model is pre-trained on an unlabelled text corpus of a single language and then fine-tuned for the specific downstream task. However, the cost of pre-training a new transformer model is high for most languages. In this work, we propose a cost-effective transfer learning method to adopt a…
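
The abstract describes the standard pre-train/fine-tune paradigm that the transfer method builds on. As a rough illustration only (not the authors' exact pipeline), the sketch below reuses a pre-trained English monolingual checkpoint and fine-tunes it on a small labelled target-language classification set with Hugging Face Transformers; the `xlnet-base-cased` checkpoint, the placeholder Tigrinya sentences, and the binary labels are assumptions made for the example.

```python
# Minimal sketch of transferring a pre-trained English monolingual transformer
# to a downstream task in another language. NOT the paper's exact method:
# model choice, data, and hyperparameters here are illustrative placeholders.
import torch
from torch.utils.data import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)


class TinyClassificationDataset(Dataset):
    """Wraps a handful of (text, label) pairs for demonstration purposes."""

    def __init__(self, texts, labels, tokenizer):
        self.encodings = tokenizer(texts, truncation=True, padding=True)
        self.labels = labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item


# Start from an English monolingual checkpoint; its pre-trained weights are
# the knowledge being transferred to the low-resource setting.
model_name = "xlnet-base-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder target-language data; a real experiment would use a labelled Tigrinya corpus.
train_texts = ["ሰላም ዓለም", "ጽቡቕ መዓልቲ"]
train_labels = [0, 1]
train_dataset = TinyClassificationDataset(train_texts, train_labels, tokenizer)

training_args = TrainingArguments(
    output_dir="xlnet-tigrinya-demo",
    num_train_epochs=1,
    per_device_train_batch_size=2,
    logging_steps=1,
)

Trainer(model=model, args=training_args, train_dataset=train_dataset).train()
```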

Cited by 1 publication (1 citation statement)
References 9 publications
“…They propose a zero-shot cross-lingual transfer technique where the resultant model is a monolingual LM adapted to a new language. Tela et al. (2020) study adaptation to the extremely low-resourced language Tigrinya. They find that English XLNet generalizes better than BERT and mBERT, which is surprising given that mBERT is trained on multiple languages.…”
Section: Language (mentioning)
Confidence: 99%