2023
DOI: 10.2298/csis220325052z
TS-GCN: Aspect-level sentiment classification model for consumer reviews

Abstract: The goal of the aspect-level sentiment classification (ASC) task is to obtain the sentiment polarity of aspect words in a text. Most existing methods ignore implicit aspects, resulting in low classification accuracy. To improve accuracy, this paper proposes a classification model for consumer reviews, abbreviated as TS-GCN (Truncated history attention and Selective transformation network-Graph Convolutional Networks). TS-GCN can classify sentiment from both explicit and implicit aspect…
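The abstract describes a GCN-based approach to aspect-level sentiment classification. As a rough illustration of the graph-convolution component such models build on (not the paper's actual architecture), a single GCN layer over a dependency graph can be sketched as follows; `gcn_layer` and its mean-normalized aggregation are illustrative assumptions, not code from TS-GCN:

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution step over a token dependency graph.

    H: (n, d) node features, one row per token
    A: (n, n) 0/1 adjacency matrix of the dependency parse
    W: (d, d_out) learnable weight matrix
    Returns ReLU(D^-1 (A + I) H W): each token averages its
    neighbors' features (plus its own), then applies a linear map.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # degree of each node
    H_agg = (A_hat / deg) @ H               # mean over the neighborhood
    return np.maximum(0.0, H_agg @ W)       # ReLU activation

# Toy 3-token chain: token0 - token1 - token2
H = np.eye(3)                               # one-hot token features
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
out = gcn_layer(H, A, np.eye(3))
```

Stacking such layers lets aspect-word representations absorb sentiment cues from syntactically connected context words, which is the general motivation for using GCNs in ASC.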

Cited by 9 publications (2 citation statements). References 38 publications.
“…This approach promotes aspect-level sentiment classification. Graph convolutional neural networks (GCNs) [39] and BERT [40] models have exhibited remarkable performance in text and sentiment classification. For instance, Wang et al. [41] proposed the KGBGCN model, which effectively addresses the challenge of capturing key information in lengthy documents and overcomes the deficiency in classification accuracy caused by a lack of domain-specific knowledge.…”
Section: Related Work
confidence: 99%
“…Fan et al. [13] argued that when both the aspect words and the context are long, the simplicity of the weighted-sum attention mechanism introduces noise; thus, they proposed a multi-granularity attention mechanism to reduce irrelevant information. Since BERT and GCN were introduced to the field of NLP, a large number of works based on them have also studied ABSA, such as Zhou et al. [10] and Zhang et al. [14].…”
Section: Review of Literature
confidence: 99%
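The weighted-sum attention that Fan et al. criticize can be sketched in a few lines. This is a generic dot-product formulation for illustration only; `aspect_attention` and the mean-pooled aspect query are assumptions, not the cited papers' exact mechanism:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())   # shift for numerical stability
    return e / e.sum()

def aspect_attention(context, aspect):
    """Weighted-sum attention over context tokens.

    context: (n, d) context-token embeddings
    aspect:  (m, d) aspect-token embeddings
    Scores each context token against the averaged aspect
    representation, then returns the attention-pooled context.
    """
    query = aspect.mean(axis=0)        # collapse aspect tokens into one query
    scores = context @ query           # dot-product relevance per token
    weights = softmax(scores)          # normalize to a distribution
    return weights @ context, weights  # pooled vector and the weights

# Toy example: two context tokens, one aspect token aligned with token 0
context = np.array([[1.0, 0.0], [0.0, 1.0]])
aspect = np.array([[1.0, 0.0]])
pooled, weights = aspect_attention(context, aspect)
```

Because every context token contributes in proportion to a single scalar score, long aspect/context pairs can pull in many weakly relevant tokens, which is the "noise" that motivates multi-granularity alternatives.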