2019
DOI: 10.1109/access.2019.2942433

LSTM-CRF Neural Network With Gated Self Attention for Chinese NER

Abstract: Named entity recognition (NER) is an essential part of natural language processing. The Chinese NER task differs from that of many European languages due to the lack of natural word delimiters. Therefore, Chinese Word Segmentation (CWS) is usually regarded as the first step in processing Chinese NER. However, word-based NER models that rely on CWS are more vulnerable to incorrectly segmented entity boundaries and to out-of-vocabulary (OOV) words. In this paper, we propose a novel character-based G…

Cited by 49 publications (23 citation statements) | References 25 publications
“…Some research has extended ordinary neural networks with a highway channel to construct deeper trainable networks for NER tasks [35,36]. This gated-mechanism optimization is also designed to control information flow across layers, and it showed verifiable improvement over some previous works [37,38]. For example, Kim's experiments [35] with a CharCNN and a highway network showed stronger generalization ability that greatly enhances language models on many fundamental NLP tasks.…”
Section: B. DL-Based Neural Network for NER Tasks (mentioning)
confidence: 99%
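The highway channel quoted above is easy to make concrete. Below is a minimal PyTorch sketch of a single highway layer using the standard gated formulation y = T(x)·H(x) + (1 − T(x))·x; the class name, dimensions, and gate-bias initialization are illustrative assumptions, not taken from the cited papers.

import torch
import torch.nn as nn

class Highway(nn.Module):
    """One highway layer: a transform gate T(x) decides how much of the
    nonlinear transform H(x) to pass through versus the raw input x."""
    def __init__(self, dim: int):
        super().__init__()
        self.transform = nn.Linear(dim, dim)   # H(x)
        self.gate = nn.Linear(dim, dim)        # T(x)
        # Bias the gate negative so early training favors carrying x through
        # (a common heuristic, assumed here rather than sourced).
        nn.init.constant_(self.gate.bias, -2.0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.transform(x))
        t = torch.sigmoid(self.gate(x))
        return t * h + (1.0 - t) * x

# Usage: stack highway layers over character-CNN features.
feats = torch.randn(32, 100)   # hypothetical batch of 100-d char features
out = Highway(100)(feats)      # same shape, gated mixture of H(x) and x

Because the gate interpolates between the transformed and untransformed signal, gradients can flow through the identity path, which is what makes deeper stacks trainable.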
“…Refs. [22,23] achieved state-of-the-art performance on NER using LSTM-CRF models in which character-level features are integrated into the word representations. With the development of the Chinese NER task, researchers building on neural network models have also made good progress on this task [24].…”
Section: Related Work (mentioning)
confidence: 99%
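The character-integrated word representation described in this statement can be sketched as follows in PyTorch: a character-level BiLSTM encodes each word's spelling, and its final states are concatenated with the word embedding before the word-level BiLSTM. This is a minimal illustration under assumed dimensions; the CRF transition and decoding layer is omitted and only per-token emission scores are produced, so it is not the cited authors' exact model.

import torch
import torch.nn as nn

class CharWordBiLSTM(nn.Module):
    """Sketch of a BiLSTM tagger whose word representations fold in
    character-level features (CRF layer omitted for brevity)."""
    def __init__(self, n_chars, n_words, n_tags,
                 char_dim=25, word_dim=100, hidden=128):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        self.char_lstm = nn.LSTM(char_dim, char_dim,
                                 bidirectional=True, batch_first=True)
        self.word_emb = nn.Embedding(n_words, word_dim, padding_idx=0)
        self.word_lstm = nn.LSTM(word_dim + 2 * char_dim, hidden,
                                 bidirectional=True, batch_first=True)
        self.emit = nn.Linear(2 * hidden, n_tags)  # per-token tag scores

    def forward(self, words, chars):
        # words: (batch, seq)   chars: (batch, seq, word_len)
        b, s, w = chars.shape
        c = self.char_emb(chars.view(b * s, w))
        _, (h, _) = self.char_lstm(c)              # h: (2, b*s, char_dim)
        char_rep = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        x = torch.cat([self.word_emb(words), char_rep], dim=-1)
        out, _ = self.word_lstm(x)
        return self.emit(out)  # emissions; a CRF would score/decode these

Character features give the model a view of word-internal structure, which is why this combination helps with OOV words and, in the Chinese setting, with segmentation errors.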
“…In recent years, attention mechanisms have been used for text classification [35], question answering [36], and named entity recognition [37]. The words in a sentence carry different levels of importance [38].…”
Section: (3) Sentence-Level Attention Filter Layer (mentioning)
confidence: 99%
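As a concrete illustration of words carrying different levels of importance, here is a minimal additive-attention pooling layer in PyTorch. Names and dimensions are hypothetical; this is a generic sketch of the mechanism, not the reviewed paper's gated self-attention.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WordAttention(nn.Module):
    """Score each word in a sentence, then pool the sequence into one
    vector weighted by those importance scores."""
    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, h, mask=None):
        # h: (batch, seq, dim); mask: (batch, seq), 1 for real tokens
        e = self.score(h).squeeze(-1)            # unnormalized word scores
        if mask is not None:
            e = e.masked_fill(mask == 0, float('-inf'))
        a = F.softmax(e, dim=-1)                 # importance weights
        return (a.unsqueeze(-1) * h).sum(dim=1), a

# Usage: pool BiLSTM states into a sentence vector.
h = torch.randn(4, 12, 256)
sent_vec, weights = WordAttention(256)(h)   # (4, 256), (4, 12)

The softmax weights make the per-word importance explicit, which is what lets an attention filter layer downweight uninformative tokens.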