2023
DOI: 10.1016/j.knosys.2023.110662

An aspect sentiment classification model for graph attention networks incorporating syntactic, semantic, and knowledge

Cited by 10 publications (4 citation statements)
References 26 publications
“…Li, Li, and Xiao (2023) proposed an aspect supervision contrastive learning (APSCL) model to capture the potential relationships between multiple aspects in the emotion subspace. To alleviate the complexity and insensitivity of syntactic relations, Zhang et al. (2023) proposed the SSK‐GAT model, which integrates syntax, semantics, and knowledge. Han et al. (2023) proposed the GMF‐SKIA model, based on a gated mechanism that fuses emotion knowledge and aspect interdependence; the model dynamically fuses the emotion-knowledge information and aspect interdependence of words.…”
Section: Related Work
mentioning
confidence: 99%
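The gated fusion mechanism attributed to GMF-SKIA above can be pictured with a minimal sketch. The class name, tensor shapes, and the sigmoid-gate formulation below are illustrative assumptions, not the published implementation.

```python
import torch
import torch.nn as nn

class GatedKnowledgeFusion(nn.Module):
    """Illustrative gated fusion of word features with external sentiment-knowledge
    features, in the spirit of the gated mechanism described above (hypothetical
    names and dimensions; not the published GMF-SKIA code)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, word_feats: torch.Tensor, knowledge_feats: torch.Tensor) -> torch.Tensor:
        # word_feats, knowledge_feats: (batch, seq_len, hidden_dim)
        g = torch.sigmoid(self.gate(torch.cat([word_feats, knowledge_feats], dim=-1)))
        # The gate decides, per position, how much knowledge information to mix in.
        return g * knowledge_feats + (1.0 - g) * word_feats
```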
“…Wang et al. reconstructed the syntax dependency tree by pruning irrelevant edges and encoded the revised dependency tree using a GAT [16]. Considering the interaction between syntactic and semantic information, multi-channel GCN-based approaches have been proposed to extract syntactic and semantic information [18,19,23]. Zhang et al. [20] generated an attention matrix and a syntactic mask matrix based on word-syntactic distances, which are processed to enhance the interaction between semantics and syntax.…”
Section: Aspect-level Sentiment Analysis
mentioning
confidence: 99%
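A syntactic mask derived from word-syntactic distances, as attributed to Zhang et al. [20] above, can be sketched as follows; the distance threshold, function name, and tensor shapes are assumptions for illustration, not the cited method's exact rule.

```python
import torch

def distance_masked_attention(scores: torch.Tensor,
                              syn_dist: torch.Tensor,
                              max_dist: int = 2) -> torch.Tensor:
    """Illustrative sketch: mask an attention-score matrix (seq_len x seq_len) with a
    syntactic-distance matrix so that only words within `max_dist` hops on the
    dependency tree attend to each other."""
    mask = (syn_dist <= max_dist)                      # boolean mask of allowed pairs
    scores = scores.masked_fill(~mask, float("-inf"))  # suppress syntactically distant pairs
    return torch.softmax(scores, dim=-1)               # renormalize over the kept neighbors
```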
“…• DR-BERT [43]: A dynamic re-weighting BERT model is built, which learns dynamic aspect-oriented semantic information. • SSK-GAT+BERT [23]: A novel graph attention network model that incorporates syntactic, semantic, and knowledge-based features. The model is trained for 15 epochs with a batch size of 16 and a learning rate of 0.000022.…”
mentioning
confidence: 99%
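The SSK-GAT+BERT training setup reported above (15 epochs, batch size 16, learning rate 0.000022) corresponds to a standard fine-tuning loop; the sketch below is generic PyTorch, with the model, data loader, and optimizer choice as placeholders rather than the authors' released code.

```python
import torch

def fine_tune(model: torch.nn.Module, train_loader, epochs: int = 15,
              lr: float = 2.2e-5) -> None:
    """Generic fine-tuning loop using the hyperparameters quoted above; the model
    and data loader (built with batch_size=16) are caller-supplied placeholders."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for inputs, labels in train_loader:
            optimizer.zero_grad()
            logits = model(inputs)
            loss = torch.nn.functional.cross_entropy(logits, labels)
            loss.backward()
            optimizer.step()
```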
“…On this basis, Yang et al. [23] proposed a full graph attention neural network (FGANN), which also considers the influence of nodes beyond the neighboring nodes and handles the graph classification task well. Zhang et al. [24] noticed that, in sentiment classification tasks, traditional models are not sensitive to syntactic structural information because of the complexity of syntactic analysis relationships, and that such models make little use of external sentiment knowledge. In this regard, a new graph attention neural network is proposed that attends to multiple levels of syntax, semantics, and knowledge by introducing a multi-head attention mechanism, and acquires sentiment knowledge by considering information such as syntax.…”
Section: Introduction
mentioning
confidence: 99%
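The multi-head attention over syntactic, semantic, and knowledge levels described above can be sketched roughly as follows; the layer layout, fusion strategy, and dimensions are assumptions for illustration, not the cited model's exact architecture.

```python
import torch
import torch.nn as nn

class MultiViewAttention(nn.Module):
    """Illustrative sketch of jointly attending over syntactic, semantic, and
    knowledge-based token representations with multi-head attention."""

    def __init__(self, hidden_dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

    def forward(self, semantic, syntactic, knowledge):
        # Each input: (batch, seq_len, hidden_dim). The three views are concatenated
        # along the sequence axis and used as keys/values for the semantic queries.
        memory = torch.cat([syntactic, semantic, knowledge], dim=1)
        fused, _ = self.attn(query=semantic, key=memory, value=memory)
        return fused
```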