2021
DOI: 10.1080/03772063.2021.2006805
Recurrent Neural Network-Based Model for Named Entity Recognition with Improved Word Embeddings

Cited by 6 publications (5 citation statements)
References 13 publications
“…They used CRF and BiLSTM models to forecast the patterns. [93] proposed an innovative strategy for building a bilingual NER system, in which improved word embeddings (IWE) are used to create the deep learning-based model. The authors applied a deep-learning Bi-LSTM [85] that incorporates POS and CNN character embeddings, and overall performance was evaluated using F1-score, recall, and precision.…”
Section: (C) Deep Learning Approach (mentioning)
confidence: 99%
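The architecture described in this citation statement, a Bi-LSTM over word embeddings combined with CNN character features and POS embeddings, can be summarized in a short sketch. The following is a minimal, illustrative PyTorch version; the class name, dimensions, and vocabulary sizes are assumptions, not details taken from the cited paper.

```python
# Minimal sketch of a Bi-LSTM tagger whose input concatenates word embeddings,
# CNN-derived character features, and POS embeddings. All sizes are illustrative.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, word_vocab, char_vocab, pos_vocab, num_tags,
                 word_dim=100, char_dim=30, pos_dim=25, char_filters=30, hidden=200):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        self.pos_emb = nn.Embedding(pos_vocab, pos_dim)
        # character CNN: convolve over the characters of each word, then max-pool
        self.char_cnn = nn.Conv1d(char_dim, char_filters, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(word_dim + char_filters + pos_dim, hidden,
                            batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)

    def forward(self, words, chars, pos):
        # words: (batch, seq), chars: (batch, seq, max_word_len), pos: (batch, seq)
        b, s, w = chars.shape
        c = self.char_emb(chars).view(b * s, w, -1).transpose(1, 2)   # (b*s, char_dim, w)
        c = torch.relu(self.char_cnn(c)).max(dim=2).values.view(b, s, -1)
        x = torch.cat([self.word_emb(words), c, self.pos_emb(pos)], dim=-1)
        h, _ = self.lstm(x)
        return self.out(h)   # per-token tag scores; train with cross-entropy over tags
```

Precision, recall, and F1-score would then be computed over the predicted tag sequences, as the statement notes.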
“…They removed the IOB formatting from the IJCNLP 200 NER Corpus, in accordance with previous work on the corpus. It also mainly covers three datasets: IJCNLP 200 NER [10], IJCNLP-2008 [21], [78], and IJCNLP-08 [93].…”
Section: Dataset (mentioning)
confidence: 99%
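Removing IOB formatting, as mentioned in this statement, amounts to dropping the B-/I- prefixes so that "B-PER" and "I-PER" both become "PER" while "O" is left unchanged. A minimal sketch follows, assuming a CoNLL-style two-column "token TAG" file; the file layout and function names are illustrative, not taken from the cited corpus.

```python
# Illustrative IOB-stripping pass over a CoNLL-style "token TAG" file.
def strip_iob(tag: str) -> str:
    # "B-PER" / "I-PER" -> "PER"; "O" and bare tags pass through unchanged
    return tag.split("-", 1)[1] if tag.startswith(("B-", "I-")) else tag

def convert_file(path_in: str, path_out: str) -> None:
    with open(path_in, encoding="utf-8") as fin, open(path_out, "w", encoding="utf-8") as fout:
        for line in fin:
            parts = line.rstrip("\n").split()
            if len(parts) >= 2:                      # token + tag line
                fout.write(f"{parts[0]}\t{strip_iob(parts[-1])}\n")
            else:                                    # blank line = sentence boundary
                fout.write("\n")
```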
“…Thirdly, deep learning models built from many stacked neural network layers are widely used in NER, such as RNN [25,26], Long Short-Term Memory (LSTM) [27], Bidirectional LSTM [28,29], BiLSTM-CRF [30,31], BERT-BiLSTM-CRF [6,32], ALBERT-BiLSTM-CRF [8], and attention mechanisms [33], which have replaced statistical machine learning techniques. These methods can handle high-dimensional data with excellent accuracy.…”
Section: Named Entity Recognition Techniques (mentioning)
confidence: 99%
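The CRF layer in the BiLSTM-CRF variants listed in this statement is typically decoded with the Viterbi algorithm over per-token emission scores and a learned tag-transition matrix. The following is a minimal NumPy sketch of that decoding step; shapes and names are illustrative and not taken from any specific cited implementation.

```python
# Viterbi decoding for a linear-chain CRF: emissions[t, j] is the encoder's score
# for tag j at position t, transitions[i, j] is the learned score of moving from
# tag i to tag j. Returns the highest-scoring tag sequence.
import numpy as np

def viterbi_decode(emissions: np.ndarray, transitions: np.ndarray) -> list:
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()                      # best score ending in each tag at step 0
    backptr = np.zeros((seq_len, num_tags), dtype=int)
    for t in range(1, seq_len):
        # candidate[i, j] = best path ending in tag i at t-1, then moving to tag j at t
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score = candidate.max(axis=0)
    # follow back-pointers from the best final tag
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]
```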
“…Following this, several additional studies also investigated RNNs for sequence-labeling problems. The authors of [20,31-33] used LSTM, whereas the authors of [34-37] used GRU for their NER tasks. In addition to these RNN-based models, the authors of [38] employ RNNs for nested NER problems, and the authors of [39] use RNNs for NER in Chinese electronic medical records.…”
Section: Recurrent Network Model for NER (mentioning)
confidence: 99%
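The GRU-based taggers cited in this statement differ from the LSTM variants mainly in the choice of recurrent cell. A minimal, illustrative PyTorch sketch of that swap is shown below; the class name and dimensions are assumptions.

```python
# Minimal GRU-based sequence tagger: identical in shape to an LSTM/BiLSTM tagger,
# with nn.GRU substituted for nn.LSTM. All sizes are illustrative.
import torch.nn as nn

class GRUTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden=200):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # nn.GRU is a drop-in replacement for nn.LSTM in this setup
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)

    def forward(self, tokens):
        h, _ = self.rnn(self.emb(tokens))
        return self.out(h)   # per-token tag scores
```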