2018 9th International Symposium on Telecommunications (IST)
DOI: 10.1109/istel.2018.8661067

Improved Deep Persian Named Entity Recognition

Cited by 8 publications (9 citation statements)
References 13 publications

“…On the other hand, the low tag count of Time and its resemblance to Date make BERT-PersNER perform worst on Time. Table 3 shows BERT-PersNER's performance against two recent studies (Shahshahani et al., 2019; Bokaei and Mahmoudi, 2018). As observed, our proposed model outperforms the baselines on Arman.…”
Section: Methods (mentioning)
Confidence: 76%

“…For the first component, Word2Vec (Mikolov et al., 2013) was used in (Lample et al., 2016), and both Word2Vec and GloVe (Pennington et al., 2014) were used in (Poostchi et al., 2018). As the second component, most studies have used a bidirectional long short-term memory (BiLSTM) network to capture long-distance dependencies (Li et al., 2020; Shahshahani et al., 2019; Lample et al., 2016; Bokaei and Mahmoudi, 2018; Poostchi et al., 2018). Other studies have used the Transformer architecture (Vaswani et al., 2017).…”
Section: Named Entity Recognition (mentioning)
Confidence: 99%
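
The two-component design this citation statement describes (pretrained word embeddings feeding a BiLSTM tagger) can be illustrated with a minimal sketch. This is only an assumption-laden illustration in PyTorch, not the architecture of any cited paper; the class name BiLSTMTagger, the dimensions, and the tag count are placeholders, and many of the cited systems also stack a CRF layer on top of the per-token scores.

```python
# Minimal sketch of the embedding + BiLSTM tagging architecture described above.
# All module and variable names here are illustrative, not taken from the cited papers.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags,
                 pretrained_embeddings=None):
        super().__init__()
        # Component 1: word embeddings (optionally initialized from Word2Vec/GloVe vectors).
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        if pretrained_embeddings is not None:
            self.embedding.weight.data.copy_(pretrained_embeddings)
        # Component 2: a BiLSTM to capture long-distance dependencies in both directions.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Per-token projection to NER tag scores (a CRF layer is often added on top).
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        lstm_out, _ = self.bilstm(embedded)    # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(lstm_out)       # (batch, seq_len, num_tags)

# Example: score a batch of two 5-token sentences against 9 BIO tags.
model = BiLSTMTagger(vocab_size=20000, embed_dim=300, hidden_dim=128, num_tags=9)
dummy_batch = torch.randint(0, 20000, (2, 5))
tag_scores = model(dummy_batch)
print(tag_scores.shape)  # torch.Size([2, 5, 9])
```
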
“…Another approach for Persian NER is provided by [33], which combines a rule-based grammatical approach. Moreover, a deep learning approach for Persian NER is provided in [34], using bidirectional LSTM networks. Beheshti-NER [35] uses multilingual Google BERT to form a fine-tuned model for Persian NER and is the closest work to the present work.…”
Section: NLP Downstream Tasks (mentioning)
Confidence: 99%
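
Since this statement describes Beheshti-NER as fine-tuning multilingual Google BERT for Persian NER, a brief sketch of that general recipe may help. It uses the Hugging Face transformers token-classification API with the public bert-base-multilingual-cased checkpoint; the BIO label set and the example sentence are assumptions for illustration, not details taken from [35].

```python
# Sketch of multilingual BERT set up for token-level NER, in the spirit of the
# fine-tuning recipe described above. Labels and data here are placeholders.
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]  # assumed BIO scheme
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=len(labels)
)

# One Persian sentence ("Tehran is the capital of Iran"), split into subwords.
# Real fine-tuning would align word-level NER labels to subword tokens and
# train over a labeled corpus; before that, the classification head is random,
# so the output below only illustrates the interface, not useful predictions.
encoding = tokenizer("تهران پایتخت ایران است", return_tensors="pt")
with torch.no_grad():
    logits = model(**encoding).logits      # (1, seq_len, num_labels)
predicted = logits.argmax(dim=-1)          # per-subword label indices
print([labels[i] for i in predicted[0].tolist()])
```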