2020
DOI: 10.1007/s11432-020-2982-y
Hierarchical LSTM with char-subword-word tree-structure representation for Chinese named entity recognition

Cited by 8 publications (6 citation statements) · References 23 publications
“…Some other researchers [Hu et al. 2019; Wang et al. 2018c; Yan et al. 2019] also imitate humans' “read + verify” reading pattern. Besides, other kinds of human reading patterns have been imitated, such as the pattern of restoring a scene from the text to understand the passage comprehensively [Tian et al. 2020], the pattern of human gaze during reading comprehension [Malmaud et al. 2020], and the pattern of tactically comparing and reasoning over candidates while choosing the best answer [Chen et al. 2020]. Here we classify all these existing models as shallow-understanding-based methods, since they pay more attention to these reading patterns' superficial frameworks but ignore some important understandings hidden in these patterns.…”
Section: Accurate (mentioning; confidence: 99%)
“…The main deficiency of these existing models is that they ignore that the underlying motivation of human readers' diverse reading patterns is to comprehensively understand the semantic meaning of the given documents and questions. Some researchers [Gong et al. 2020a; Guo et al. 2020b; Mihaylov and Frank 2019] explore the semantic-information understanding issue, but their methods either require prerequisite resources such as an extra knowledge base [Guo et al. 2020b] or linguistic annotations [Mihaylov and Frank 2019], or depend on large-scale pretrained language models […]. We further notice that there are usually three kinds of hierarchical understandings when human readers conduct a reading comprehension task: the semantic meaning understanding of words, the interaction understanding between the input question and documents, and the answer-supporting-cue understanding among different documents.…”
Section: Introduction (mentioning; confidence: 99%)
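The excerpt above describes three stacked levels of understanding: word semantics, question-document interaction, and cross-document answer-cue aggregation. The following PyTorch sketch shows one way such a layered reader could be wired together; it is purely illustrative, and every module name, dimension, and the attention-based fusion strategy are assumptions, not the cited paper's actual design.

```python
# A minimal, hypothetical sketch of three hierarchical understandings:
# (1) word-level semantic encoding, (2) question-document interaction,
# (3) cross-document answer-cue aggregation. All names and sizes are
# illustrative assumptions.
import torch
import torch.nn as nn

class ThreeLevelReaderSketch(nn.Module):
    def __init__(self, vocab=30000, dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)  # level 1: word semantics
        self.q_doc_attn = nn.MultiheadAttention(dim, 4, batch_first=True)   # level 2
        self.doc_doc_attn = nn.MultiheadAttention(dim, 4, batch_first=True)  # level 3
        self.scorer = nn.Linear(dim, 1)

    def forward(self, question_ids, doc_ids):
        # question_ids: (batch, q_len); doc_ids: (batch, n_docs, d_len)
        b, n_docs, d_len = doc_ids.shape
        q = self.emb(question_ids)
        docs = self.emb(doc_ids.view(b * n_docs, d_len))
        # Level 2: each document attends to the question.
        q_rep = q.repeat_interleave(n_docs, dim=0)
        fused, _ = self.q_doc_attn(docs, q_rep, q_rep)
        # Pool each document into a single vector.
        doc_vecs = fused.mean(dim=1).view(b, n_docs, -1)
        # Level 3: documents attend to each other to share supporting cues.
        supported, _ = self.doc_doc_attn(doc_vecs, doc_vecs, doc_vecs)
        return self.scorer(supported).squeeze(-1)  # per-document answer score

model = ThreeLevelReaderSketch()
q = torch.randint(0, 30000, (2, 12))
docs = torch.randint(0, 30000, (2, 3, 40))
print(model(q, docs).shape)  # torch.Size([2, 3])
```

Mean-pooling each document and using plain multi-head attention keeps the sketch short; a real reader would use contextual encoders and finer-grained span scoring.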
“…MECT is a recent advanced NER model using cross-transformers, in which the character-level representation was enhanced with knowledge of Chinese structural components (Wu, Song, and Feng 2021). Some studies have looked into more effective neural architectures based on multi-feature ensemble approaches (Chaudhary et al. 2018; Zhang et al. 2019; Gong et al. 2020). For example, Gong et al. developed a tree-structure representation based on a hierarchical long short-term memory (HiLSTM) framework by extracting the features of characters, subwords, and context-aware predicted words (Gong et al. 2020).…”
Section: Subword-based Approach for Entity Extraction (mentioning; confidence: 99%)
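To make the HiLSTM description concrete, here is a minimal PyTorch sketch of a char-subword-word hierarchy for per-character tagging. All vocabulary sizes and dimensions, and the assumption that subword and word ids come pre-aligned to character positions, are illustrative; the actual model of Gong et al. uses a tree-structured representation rather than this simple stacking.

```python
# A minimal sketch of a hierarchical char-subword-word BiLSTM tagger,
# loosely following the HiLSTM idea quoted above. Dimensions, vocabulary
# sizes, and the fusion strategy are assumptions, not the authors' setup.
import torch
import torch.nn as nn

class HiLSTMSketch(nn.Module):
    def __init__(self, char_vocab=4000, subword_vocab=20000, word_vocab=50000,
                 emb_dim=64, hidden_dim=128, num_tags=9):
        super().__init__()
        self.char_emb = nn.Embedding(char_vocab, emb_dim)
        self.subword_emb = nn.Embedding(subword_vocab, emb_dim)
        self.word_emb = nn.Embedding(word_vocab, emb_dim)
        # One BiLSTM per granularity; each coarser layer consumes the finer
        # layer's hidden states concatenated with its own embeddings.
        self.char_lstm = nn.LSTM(emb_dim, hidden_dim,
                                 bidirectional=True, batch_first=True)
        self.subword_lstm = nn.LSTM(emb_dim + 2 * hidden_dim, hidden_dim,
                                    bidirectional=True, batch_first=True)
        self.word_lstm = nn.LSTM(emb_dim + 2 * hidden_dim, hidden_dim,
                                 bidirectional=True, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, char_ids, subword_ids, word_ids):
        # Assumes subword and word ids are already aligned to character
        # positions (one id per character), which sidesteps the tree
        # alignment machinery of the real model.
        h_char, _ = self.char_lstm(self.char_emb(char_ids))
        h_sub, _ = self.subword_lstm(
            torch.cat([self.subword_emb(subword_ids), h_char], dim=-1))
        h_word, _ = self.word_lstm(
            torch.cat([self.word_emb(word_ids), h_sub], dim=-1))
        return self.classifier(h_word)  # per-character tag logits

# Usage: a batch of 2 sentences, 10 characters each.
model = HiLSTMSketch()
chars = torch.randint(0, 4000, (2, 10))
subwords = torch.randint(0, 20000, (2, 10))
words = torch.randint(0, 50000, (2, 10))
print(model(chars, subwords, words).shape)  # torch.Size([2, 10, 9])
```

Stacking the three BiLSTMs so that each coarser granularity consumes the finer layer's hidden states is the simplest approximation of the hierarchy; the tree alignment between characters, subwords, and context-aware predicted words is deliberately omitted here.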