2021
DOI: 10.1007/978-3-030-71590-8_4
Chinese Named Entity Recognition: Applications and Challenges

Cited by 5 publications (3 citation statements)
References 27 publications
“…Compared with English NER, Chinese NER faces more challenges, primarily because entity boundaries are hard to determine in Chinese text and the language has a complex syntactic structure. Previous research [4] compared character-based and word-based approaches; character-based NER methods often fail to fully exploit explicit word and word-sequence information, despite its potential value. To leverage lexical features, Zhang et al. [5] proposed the Lattice-LSTM model, which encodes all words in a sentence matched by individual characters into a DAG.…”
Section: Lexical Information for NER
confidence: 99%
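The statement above describes the lexicon-matching step that gives Lattice-LSTM its DAG input: for every character position, the model collects all lexicon words that start there. The following is a minimal sketch of that matching step only; the lexicon, sentence, and maximum word length are illustrative placeholders, not values from the cited paper.

```python
# Minimal sketch of the lexicon-matching step behind Lattice-LSTM's input:
# for every character position, find lexicon words that start there,
# yielding a DAG of (start, end, word) arcs over the character sequence.
# Lexicon, sentence, and max_word_len below are illustrative placeholders.

def build_word_lattice(chars, lexicon, max_word_len=4):
    """Return a list of (start, end, word) arcs; end is exclusive."""
    arcs = []
    n = len(chars)
    for start in range(n):
        for end in range(start + 1, min(start + max_word_len, n) + 1):
            word = "".join(chars[start:end])
            if word in lexicon:
                arcs.append((start, end, word))
    return arcs

if __name__ == "__main__":
    sentence = "南京市长江大桥"  # "Nanjing City Yangtze River Bridge"
    lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
    for start, end, word in build_word_lattice(list(sentence), lexicon):
        print(start, end, word)
```

In the full model, each arc feeds an extra LSTM cell whose state is merged into the character-level path, so overlapping segmentations (e.g. 市长 vs. 长江) coexist in the lattice rather than being resolved by a hard word segmenter.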
“…• Lattice-LSTM [5]: For the Chinese NER task, an LSTM model with a lattice structure, Lattice-LSTM, is proposed to encode both the character features of the input sequence and all potential words matched against the lexicon, fusing word and word-sequence information for NER. • CAN-NER [52]: Extracts local character information through a CNN, then captures adjacent-character and context information with a global self-attention layer built on a GRU.…”
confidence: 99%
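The CAN-NER description in this statement amounts to a character CNN for local features followed by a recurrent layer with global self-attention for context. Below is a hedged sketch of such an encoder; the layer sizes, kernel width, and head count are illustrative assumptions, not the published CAN-NER configuration.

```python
# Hedged sketch of a CAN-NER-style encoder as described in the statement:
# a character CNN for local features, then a BiGRU with global self-attention
# for context. Dimensions and layer choices are illustrative, not the
# published configuration.
import torch
import torch.nn as nn

class CANStyleEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, cnn_dim=128, hidden=128, heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # local character features via a 1-D convolution over the sequence
        self.conv = nn.Conv1d(emb_dim, cnn_dim, kernel_size=3, padding=1)
        # contextual encoding with a bidirectional GRU
        self.gru = nn.GRU(cnn_dim, hidden, batch_first=True, bidirectional=True)
        # global self-attention over the GRU states
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)

    def forward(self, char_ids):                      # (batch, seq_len)
        x = self.embed(char_ids)                      # (batch, seq_len, emb_dim)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, cnn_dim, seq_len)
        x, _ = self.gru(x.transpose(1, 2))            # (batch, seq_len, 2*hidden)
        out, _ = self.attn(x, x, x)                   # global self-attention
        return out                                    # per-character representations for a tagger

if __name__ == "__main__":
    enc = CANStyleEncoder(vocab_size=5000)
    reps = enc(torch.randint(0, 5000, (2, 7)))
    print(reps.shape)  # torch.Size([2, 7, 256])
```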