2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES)
DOI: 10.1109/niles50944.2020.9257975
A Multi-Embeddings Approach Coupled with Deep Learning for Arabic Named Entity Recognition

Cited by 5 publications (2 citation statements) · References 22 publications
“…In the medical domain, [88] demonstrated the effectiveness of model ensembles in technical fields. [89] used a multi-embeddings approach coupled with deep learning for Arabic NER, achieving an F1 score of 77.62 on the AQMAR dataset, thus setting a new performance standard for NER in Arabic. [90] utilized the transformer model XLM-RoBERTa for Hindi NER, achieving F1-scores of 0.96 (micro) and 0.80 (macro), further validating the effectiveness of transformers in language-specific NER tasks.…”
Section: Model Evaluation and Comparison
confidence: 99%
“…Reference [22] is another study that combined different types of classical and contextual embeddings: pre-trained word embeddings such as FastText and AraVec, pooled contextual embeddings, and AraBERT embeddings for the Arabic Named Entity Recognition (NER) task on the AQMAR dataset. These embeddings are then fed into a Bi-LSTM.…”
Section: Using AraBERT Model in NLP Tasks
confidence: 99%
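
The citation statement above describes the paper's overall design: several classical and contextual embedding sources are combined per token and passed to a Bi-LSTM for tag prediction. The sketch below is not the authors' code; it is a minimal PyTorch illustration of that multi-embeddings-plus-Bi-LSTM pattern. The class name, embedding dimensions, and the random placeholder tensors are assumptions standing in for real pre-trained FastText, AraVec, pooled contextual, and AraBERT vectors.

```python
# Minimal sketch (assumed, not the authors' implementation) of a
# multi-embeddings NER tagger: per-token embeddings from several sources
# are concatenated and fed to a Bi-LSTM with a per-token classifier.
import torch
import torch.nn as nn


class MultiEmbeddingBiLSTMTagger(nn.Module):
    def __init__(self, embedding_dims, hidden_size, num_tags):
        super().__init__()
        input_dim = sum(embedding_dims)          # width of concatenated embeddings
        self.bilstm = nn.LSTM(
            input_size=input_dim,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * hidden_size, num_tags)

    def forward(self, embedding_list):
        # embedding_list: one (batch, seq_len, dim_i) tensor per embedding source
        x = torch.cat(embedding_list, dim=-1)    # (batch, seq_len, sum(dims))
        lstm_out, _ = self.bilstm(x)             # (batch, seq_len, 2 * hidden_size)
        return self.classifier(lstm_out)         # per-token tag logits


# Usage with placeholder tensors; in practice these would be the pre-computed
# FastText, AraVec/pooled contextual, and AraBERT vectors for each token.
dims = [300, 300, 768]                           # assumed embedding dimensions
model = MultiEmbeddingBiLSTMTagger(dims, hidden_size=256, num_tags=9)
batch, seq_len = 2, 10
fake_inputs = [torch.randn(batch, seq_len, d) for d in dims]
logits = model(fake_inputs)                      # shape: (2, 10, 9)
```

Concatenating the sources, rather than averaging them, preserves the complementary information of static word vectors and contextual representations, which is the motivation the citing paper attributes to the multi-embeddings approach.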