2022
DOI: 10.48550/arxiv.2203.12907
Preprint

Mono vs Multilingual BERT: A Case Study in Hindi and Marathi Named Entity Recognition

Cited by 3 publications (3 citation statements)
References 0 publications
“…The monolingual Marathi models released in this work have been shown to work better than the currently available alternatives. A study conducted for Marathi named entity recognition highlights the importance of our models and datasets [11]. A similar study for Marathi text classification and specifically hate speech detection was conducted in [18].…”
Section: Impact (mentioning)
confidence: 98%
“…A named entity recognition (NER) model for Hindi and Marathi, two low-resource Indian languages, is introduced by Litake et al. 21 . For NER tasks, transformer-based models are frequently employed.…”
Section: Hindi Text Classification (mentioning)
confidence: 99%
“…We evaluate monolingual and multilingual BERT models on Marathi corpus to compare the performance. A similar analysis for Hindi and Marathi named entity recognition has been performed in [13].…”
Section: Introduction (mentioning)
confidence: 99%