2019 IEEE-RIVF International Conference on Computing and Communication Technologies (RIVF)
DOI: 10.1109/rivf.2019.8713663
Aspect Extraction with Bidirectional GRU and CRF

Cited by 14 publications (12 citation statements)
References 21 publications
“…(1) Frequency-based methods [1,2]; (2) Rule/template-based methods [3,4]; (3) Graph theory-based methods [6][7][8]; (4) Based on CRF or combined CRF and deep learning methods [5,[9][10][11][12][13][14][15][16][17][18][19][20][21][22][23][24][25].…”
Section: Overview of the Methods of Opinion Targets Extraction (mentioning; confidence: 99%)
“…Hu [10] built on the CRF method and, with the help of a recurrent neural network (RNN) model, carried out extraction research on English and Dutch named entity recognition (NER) datasets, achieving good results in terms of accuracy. Tran [22] combined a bidirectional gated recurrent unit (BiGRU) with CRF to extract aspects from the SemEval-2014 dataset, obtaining better accuracy than the state-of-the-art methods. The methods above achieved good results in terms of extraction accuracy.…”
Section: Research Situation (mentioning; confidence: 99%)
“…Poria, Cambria & Gelbukh (2016) proposed a non-linear, supervised CNN with linguistic patterns to process the OE task. Tran, Hoang & Huynh (2019) proposed a model that combines BiGRU and CRF, where the BiGRU captures the long-distance dependency relationships of sentences and the CRF models the relationships of label transfer. Xu et al (2018) employed two types of pre-training embedding (i.e.…”
Section: Information Extraction (IE) (mentioning; confidence: 99%)
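The statement above describes the general shape of a BiGRU-CRF tagger (a bidirectional GRU for long-distance context, a CRF for tag-transition structure) rather than the authors' exact implementation. The following PyTorch sketch, which relies on the third-party pytorch-crf package, illustrates that idea for BIO-style aspect tagging; the class name, layer sizes, and tag inventory are illustrative assumptions and do not come from the cited paper.

```python
import torch
import torch.nn as nn
from torchcrf import CRF  # third-party "pytorch-crf" package


class BiGRUCRFTagger(nn.Module):
    """Illustrative BiGRU-CRF sequence tagger for BIO-style aspect tags."""

    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Bidirectional GRU encodes each token with left and right context,
        # capturing long-distance dependencies within the sentence.
        self.bigru = nn.GRU(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Per-token emission scores, one per tag (e.g. B-ASPECT, I-ASPECT, O).
        self.to_tags = nn.Linear(2 * hidden_dim, num_tags)
        # CRF layer models transitions between adjacent tags (label transfer).
        self.crf = CRF(num_tags, batch_first=True)

    def _emissions(self, token_ids):
        encoded, _ = self.bigru(self.embedding(token_ids))
        return self.to_tags(encoded)

    def loss(self, token_ids, tags, mask=None):
        # Negative log-likelihood of the gold tag sequence under the CRF.
        return -self.crf(self._emissions(token_ids), tags, mask=mask)

    def predict(self, token_ids, mask=None):
        # Viterbi decoding of the highest-scoring tag sequence.
        return self.crf.decode(self._emissions(token_ids), mask=mask)


# Toy usage: a batch of 2 sentences, 5 tokens each, 3 BIO tags.
model = BiGRUCRFTagger(vocab_size=1000, num_tags=3)
tokens = torch.randint(1, 1000, (2, 5))
gold_tags = torch.randint(0, 3, (2, 5))
model.loss(tokens, gold_tags).backward()
print(model.predict(tokens))
```

The point of the CRF layer in such a design is that decoding yields globally consistent BIO sequences (e.g. no I-ASPECT directly after O) instead of independent per-token predictions.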
“…In [24], the sentiment of each aspect in opinion text was found by employing multiple attention control mechanisms on memory in a recurrent attention network. Our previous approaches in [25], [26] proposed two models integrating CRF with RNN variants: the gated recurrent unit (GRU) and IndyLSTM. We made the best use of the bidirectional mechanism of BiGRU and Bi-IndyLSTM to classify aspects proficiently.…”
Section: Introduction (mentioning; confidence: 99%)