2020
DOI: 10.1007/978-3-030-60457-8_28
Chinese Question Classification Based on ERNIE and Feature Fusion

Cited by 4 publications (2 citation statements)
References 7 publications
“…Recently, medical Q&A and text classification methods have improved greatly; however, recent studies have rarely considered this issue from the perspective of feature fusion. Notably, Liu et al. [26] proposed a new method for Chinese question classification based on ERNIE and feature fusion. They first built a generalized language representation model by integrating knowledge, then used Highway-CNN and Highway-DCU-BiLSTM to extract local features and sequence features separately, and finally fused the two feature sets with a linear formula.…”
Section: B. Related Methods
confidence: 99%
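The pipeline the excerpt describes (highway-gated feature extractors whose outputs are fused linearly) can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration, not the authors' code: the highway gate follows the standard formulation y = t ⊙ H(x) + (1 − t) ⊙ x, the feature vectors stand in for the Highway-CNN and Highway-DCU-BiLSTM outputs, and the fusion weight `alpha` is a hypothetical scalar.

```python
import numpy as np

def highway(x, W_h, b_h, W_t, b_t):
    """Standard highway layer: a gate t blends a transform H(x) with x itself."""
    H = np.tanh(x @ W_h + b_h)                # candidate transform H(x)
    t = 1.0 / (1.0 + np.exp(-(x @ W_t + b_t)))  # transform gate, in (0, 1)
    return t * H + (1.0 - t) * x              # carry the rest of x through unchanged

rng = np.random.default_rng(0)
d = 8  # hypothetical feature dimension

# Stand-ins for the two feature views named in the excerpt.
f_local = rng.standard_normal(d)  # local features (Highway-CNN role)
f_seq = rng.standard_normal(d)    # sequence features (Highway-DCU-BiLSTM role)

# Pass each view through its own highway layer.
W_h, b_h = 0.1 * rng.standard_normal((d, d)), np.zeros(d)
W_t, b_t = 0.1 * rng.standard_normal((d, d)), np.zeros(d)
h_local = highway(f_local, W_h, b_h, W_t, b_t)
h_seq = highway(f_seq, W_h, b_h, W_t, b_t)

# Linear fusion of the two views; alpha would be learned or tuned in practice.
alpha = 0.5
fused = alpha * h_local + (1.0 - alpha) * h_seq
```

In the paper's actual model each branch would have its own parameters and the fused vector would feed a classifier; here the point is only the shape of the computation: gated extraction per view, then a weighted sum.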
“…ERNIE is a pretrained language model for Chinese corpora. Question classification has been addressed by combining the ERNIE pretrained model with feature fusion [42]. Specifically, RoBERTa [43] has been implemented and fine-tuned for Chinese text classification, and ChineseBERT [44] incorporates both Chinese character glyph and pinyin information.…”
Section: PLOS ONE
confidence: 99%