2022
DOI: 10.1109/access.2022.3214233
Aspect-Level Sentiment Analysis Using CNN Over BERT-GCN

Abstract: Context-based GCNs have achieved relatively good effectiveness on the sentiment analysis task, especially aspect-level sentiment analysis (ALSA). However, previous context-based GCNs for ALSA suffer from two limitations: (i) they use GCNs limited to a few layers (two or three) because of the vanishing gradient, which caps their performance; and (ii) they do not consider helpful information about the hidden context between words. To address these limitations, this paper proposes a novel CNN over the BERT…


Cited by 23 publications (9 citation statements)
References 26 publications
“…The model includes self-supervised domain-specific fine-tuning of the BERT model and a cross-domain evaluation. 10) CNN over BERT-GCN [15]: This model adds a convolutional layer after the GCN layer, so that the CNN can analyze the important features in the sentence and the aspect sentiment in the input sentence.…”
Section: Capsar [12]: This Model Injects Aspect Information Into Th…
confidence: 99%
“…Most of these models use the attention mechanism and the skip connections of ResNet to deepen the network, but this approach may hinder information flow, so the model cannot make full use of the features extracted at each layer. CNN over BERT-GCN [15] proposed combining BERT and BiLSTM to take into account the useful information hidden in the context, and exploited a GCN model with multiple convolutional layers to capture context features. ASBABM [16] proposed an aspect-level sentiment analysis model based on BERT and multi-level attention mechanisms.…”
Section: Introduction
confidence: 99%
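The excerpts above describe the pipeline at a high level: contextual token embeddings (from BERT/BiLSTM) are refined by stacked GCN layers over a word graph, and a CNN then slides over the resulting token features to pool the salient ones for classification. A minimal NumPy sketch of that idea is below; all names (`gcn_layer`, `conv1d_over_tokens`), the toy adjacency matrix, and the random embeddings are illustrative assumptions, not the paper's actual implementation or API.

```python
# Hypothetical sketch of the CNN-over-GCN idea; not the paper's code.
import numpy as np

def gcn_layer(A_hat, X, W):
    """One graph-convolution step: aggregate neighbors, project, ReLU."""
    return np.maximum(A_hat @ X @ W, 0.0)

def conv1d_over_tokens(H, kernels):
    """Slide each kernel (width k, dim d) over the token axis; max-pool."""
    n, _ = H.shape
    feats = []
    for K in kernels:
        k = K.shape[0]
        acts = [np.sum(H[i:i + k] * K) for i in range(n - k + 1)]
        feats.append(max(acts))
    return np.array(feats)

rng = np.random.default_rng(0)
n_tokens, d = 6, 8
X = rng.normal(size=(n_tokens, d))       # stand-in for BERT/BiLSTM embeddings
A = np.eye(n_tokens)                     # self-loops
A[0, 1] = A[1, 0] = 1.0                  # a toy dependency edge
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt      # symmetric normalization

H = X
for _ in range(3):                       # more than the usual 2-3 layers is the point
    H = gcn_layer(A_hat, H, rng.normal(scale=0.3, size=(d, d)))

kernels = [rng.normal(size=(k, d)) for k in (2, 3)]
features = conv1d_over_tokens(H, kernels)  # CNN pools salient n-gram features
print(features.shape)                      # (2,) — one pooled feature per kernel
```

The pooled `features` vector would then feed a linear classifier to predict the aspect's sentiment polarity.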
“…In recent years, deep learning models such as the Convolutional Neural Network (CNN) [10], Long Short-Term Memory (LSTM) [11], Gated Recurrent Unit (GRU) [12], and the attention mechanism [13] have achieved good results in the field of sentiment analysis. These models can effectively use large amounts of text data to capture contextual and emotional information, thus improving the accuracy of sentiment analysis.…”
Section: Related Work
confidence: 99%
“…The model uses only sentence sequences as the input to the BERT encoder and proposes a new interactive gate mechanism, called the common gate, to effectively reduce the interference of noisy words, obtain contextual information, and enhance sentiment prediction for aspects. PHAN et al. [10] proposed a new BERT-GCN model that combines the BERT and BiLSTM models to address the limitations of context-based graph convolutional networks for aspect-based sentiment analysis. By adding convolutional layers from a CNN after the GCN, the model handles the drawback that GCNs are limited to only a few layers, and further improves performance by considering the beneficial information of the hidden context between words.…”
Section: Related Work
confidence: 99%