2020
DOI: 10.1016/j.knosys.2020.106292

SK-GCN: Modeling Syntax and Knowledge via Graph Convolutional Network for aspect-level sentiment classification

Cited by 169 publications (68 citation statements)
References 38 publications
“…Results: To demonstrate the effectiveness of the proposed method, we compare it with the following baselines: (1) the feature-based model that applies feature engineering and the SVM model (Wagner et al., 2014), (2) the deep learning models based on the sequential order of the words in the sentences, including CNN, LSTM, attention and the gating mechanism (Wagner et al., 2016; Wang et al., 2016; Tang et al., 2016; Huang et al., 2018), and (3) the graph-based models that exploit dependency trees to improve the deep learning models for ABSA (Huang and Carley, 2019; Hou et al., 2019; Wang et al., 2020).…”
Section: Methods
confidence: 99%
“…The current state-of-the-art deep learning models for ABSA feature the graph-based models, where dependency trees are leveraged to improve performance (Huang and Carley, 2019; Hou et al., 2019). However, to the best of our knowledge, none of these works has used the information from the aspect term to filter the graph-based hidden vectors and exploited importance scores for words from dependency trees as we do in this work.…”
Section: Related Work
confidence: 99%
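To make the aspect-filtering idea in this excerpt concrete, here is a minimal sketch (purely illustrative, not the cited paper's code; `AspectGate` and all names are hypothetical) of gating graph-based hidden vectors with a pooled aspect-term vector:

```python
# Hypothetical sketch: gate each graph-derived word vector by a sigmoid
# computed from the word vector and a pooled aspect representation.
import torch
import torch.nn as nn

class AspectGate(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, h: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
        # h:      (batch, seq_len, dim) graph-based hidden vectors
        # aspect: (batch, dim) pooled aspect-term representation
        a = aspect.unsqueeze(1).expand_as(h)          # broadcast to each word
        g = torch.sigmoid(self.gate(torch.cat([h, a], dim=-1)))
        return g * h                                  # filtered hidden vectors
```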
“…proposed a GCN model over the dependency tree of the sentence to enhance the feature representations of aspects learned by a Bi-directional LSTM (Bi-LSTM). In addition, to leverage the merit of BERT (Devlin et al., 2019), a GCN model based on selective attention was proposed to extract and aggregate the most important contextual features for the aspect representation (Hou et al., 2019). The above GCN-based models, however, neither considered the specific aspect when constructing the graph of the sentence nor extracted inter-aspect sentiment relations for the specific aspect.…”
Section: Related Work
confidence: 99%
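The GCN-over-dependency-tree design described in this excerpt can be sketched as follows (an illustrative PyTorch sketch under stated assumptions, not the cited authors' implementation; `DepGCNLayer` and the toy dimensions are hypothetical):

```python
# Hypothetical sketch: one GCN layer over a dependency-parse adjacency
# matrix, applied to Bi-LSTM hidden states of the sentence.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepGCNLayer(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.linear = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, seq_len, hidden_dim) Bi-LSTM hidden states
        # adj: (batch, seq_len, seq_len) dependency adjacency
        #      (1 where two words are linked, plus self-loops)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)  # node degrees
        h = adj @ self.linear(h) / deg                    # mean over neighbors
        return F.relu(h)

# Toy usage: encode the sentence, then propagate along dependency arcs.
bilstm = nn.LSTM(input_size=300, hidden_size=150,
                 bidirectional=True, batch_first=True)
gcn = DepGCNLayer(hidden_dim=300)
tokens = torch.randn(2, 10, 300)         # toy embeddings, batch of 2
adj = torch.eye(10).expand(2, -1, -1)    # toy adjacency (self-loops only)
h, _ = bilstm(tokens)
aspect_features = gcn(h, adj)            # (2, 10, 300)
```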
“…AEN+BERT is the AEN model based on pre-trained BERT. SA-GCN+BERT (Hou et al., 2019) is a GCN-based model with a dependency tree and BERT, which employs selective attention to find important words and derive the representations of aspects. InterGCN is our complete proposed model.…”
Section: Comparison Models
confidence: 99%
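As an illustration of the selective-attention idea mentioned here, the sketch below (hypothetical names and choice of top-k selection; not SA-GCN's actual code) keeps only the k context words that score highest against the aspect query and pools over them:

```python
# Hypothetical sketch: score context words against the aspect query,
# keep the top-k, renormalize, and pool an aspect representation.
import torch
import torch.nn.functional as F

def selective_attention(aspect_q: torch.Tensor,
                        context: torch.Tensor, k: int = 5) -> torch.Tensor:
    # aspect_q: (batch, dim), context: (batch, seq_len, dim)
    k = min(k, context.size(1))
    scores = torch.einsum('bd,bsd->bs', aspect_q, context)  # (batch, seq_len)
    topk = scores.topk(k, dim=-1).indices
    mask = torch.full_like(scores, float('-inf'))
    mask.scatter_(-1, topk, 0.0)                 # 0 on top-k, -inf elsewhere
    weights = F.softmax(scores + mask, dim=-1)   # mass only on top-k words
    return torch.einsum('bs,bsd->bd', weights, context)
```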
“…of NLP tasks. For example, some researchers incorporated a hard constraint into SANs to select a subset of input words, on top of which self-attention is conducted (Shen et al., 2018c; Hou et al., 2019; Yang et al., 2019b). … and Guo et al. (2019) proposed a soft mechanism by imposing a learned Gaussian bias over the original attention distribution to enhance its ability to capture local contexts.…”
Section: Introduction
confidence: 99%
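The learned Gaussian bias this excerpt describes can be sketched roughly as below (a minimal illustration under stated assumptions, not Guo et al.'s exact formulation; `sigma` is fixed to a constant here rather than learned):

```python
# Hypothetical sketch: penalize attention logits by the squared distance
# between positions, i.e., a Gaussian bias centered on each query token.
import torch
import torch.nn.functional as F

def gaussian_local_attention(q: torch.Tensor, k: torch.Tensor,
                             v: torch.Tensor, sigma: float = 2.0) -> torch.Tensor:
    # q, k, v: (batch, seq_len, dim)
    d = q.size(-1)
    logits = q @ k.transpose(-2, -1) / d ** 0.5        # (batch, L, L)
    pos = torch.arange(q.size(1), dtype=q.dtype, device=q.device)
    dist = (pos.unsqueeze(0) - pos.unsqueeze(1)) ** 2  # squared |i - j|
    logits = logits - dist / (2 * sigma ** 2)          # Gaussian locality bias
    return F.softmax(logits, dim=-1) @ v
```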