2022
DOI: 10.1016/j.ipm.2022.102953
Aspect sentiment analysis with heterogeneous graph neural networks

Cited by 46 publications (7 citation statements)
References 11 publications
“…Liang et al. [11] used a constituency tree to build an adjacency matrix for each phrase layer. Hete_GNNs [39] and Sentic GCN [19] introduced a sentiment dictionary into word relationships. In addition, BERT-based context representation models [40]–[42] can improve performance whether the model is graph-based or not [43].…”
Section: Related Work
confidence: 99%
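The excerpt above describes injecting a sentiment dictionary into word relationships (as in Hete_GNNs and Sentic GCN). A minimal sketch of that idea, assuming a toy lexicon and an illustrative edge-weighting scheme (not the papers' actual formulation):

```python
# Sketch: strengthen word-graph edges that touch sentiment-bearing words.
# The lexicon and the "1 + |polarity|" boost are illustrative assumptions.
import numpy as np

# Toy sentiment lexicon: word -> polarity score (hypothetical values)
SENTIMENT_LEXICON = {"great": 0.8, "terrible": -0.9, "slow": -0.4}

def sentiment_adjacency(tokens, dep_edges):
    """Build a symmetric adjacency matrix from dependency edges, then
    boost edges incident to sentiment-bearing words."""
    n = len(tokens)
    adj = np.eye(n)  # self-loops
    for i, j in dep_edges:
        boost = abs(SENTIMENT_LEXICON.get(tokens[i], 0.0)) + \
                abs(SENTIMENT_LEXICON.get(tokens[j], 0.0))
        adj[i, j] = adj[j, i] = 1.0 + boost
    return adj

tokens = ["the", "food", "is", "great"]
edges = [(0, 1), (1, 2), (2, 3)]
A = sentiment_adjacency(tokens, edges)
# The edge touching "great" (2, 3) is weighted higher than plain edges.
```

A GCN layer would then aggregate over this weighted matrix, so sentiment-bearing neighbors contribute more to each word's representation.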
“…Heterogeneous global graph neural networks were utilized by Pang et al. [32] to consider item transition patterns from other users' historical sessions for deducing user preferences from the current and historical sessions. Lu et al. [33] applied heterogeneous graph attention networks for aspect sentiment analysis. Liang et al. [34] designed a heterogeneous graph-based model for emotional conversation generation.…”
Section: Heterogeneous Graph Attention Network (HGAT)-Related Work
confidence: 99%
“…For a single category, let TP be the number of correctly predicted samples, FP be the number of samples from other categories predicted as the current category, and FN be the number of samples from the current category predicted as other categories. The calculation formulas for accuracy and MF1 are then given in (18) and (19).…”
Section: Datasets and Evaluation Metrics
confidence: 99%
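The TP/FP/FN definitions above determine accuracy and macro-F1 (MF1). A sketch of the standard computation, assuming multi-class label lists (the exact formulas (18) and (19) are in the cited paper):

```python
# Accuracy and macro-F1 from per-class TP/FP/FN, as defined above.

def accuracy(y_true, y_pred):
    """Fraction of samples whose predicted class matches the true class."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def macro_f1(y_true, y_pred):
    """Average of per-class F1, where F1 = 2*TP / (2*TP + FP + FN)."""
    classes = set(y_true) | set(y_pred)
    f1_scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        denom = 2 * tp + fp + fn
        f1_scores.append(2 * tp / denom if denom else 0.0)
    return sum(f1_scores) / len(f1_scores)
```

Macro averaging weights every class equally, which is why MF1 is preferred over accuracy when sentiment classes are imbalanced.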
“…In recent years, deep learning methods represented by neural networks have attracted increasing attention because they can automatically generate useful feature representations from aspects and their contexts, achieving better aspect-level sentiment classification without handcrafted features. In particular, attention mechanisms [9,10] and graph neural networks [11][12][13][14][15] are widely used in aspect-level sentiment classification because they can focus on the aspect words in a sentence and handle unstructured data [16][17][18][19][20][21][22][23]. For example, Su et al. [16] proposed a progressive self-supervision attention learning approach for attentional aspect-level sentiment analysis.…”
Section: Introduction
confidence: 99%
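The excerpt above notes that attention mechanisms let models focus on aspect words in a sentence. A minimal sketch of one such aspect-aware attention step, with illustrative names and dimensions (not any cited model's actual architecture):

```python
# Sketch: weight context words by dot-product similarity to the aspect
# vector, then pool them into a single sentence representation.
import numpy as np

def aspect_attention(context, aspect):
    """context: (n, d) word vectors; aspect: (d,) aspect vector.
    Returns softmax attention weights and the attended sentence vector."""
    scores = context @ aspect                 # (n,) similarity scores
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights, weights @ context         # (d,) weighted sum

rng = np.random.default_rng(0)
ctx = rng.normal(size=(5, 4))   # 5 context words, 4-dim embeddings
asp = rng.normal(size=4)        # aspect embedding
w, pooled = aspect_attention(ctx, asp)
# w sums to 1; pooled has the same dimensionality as each word vector.
```

Context words similar to the aspect receive larger weights, so the pooled vector emphasizes the aspect-relevant parts of the sentence before classification.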