Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1551

Capsule Network with Interactive Attention for Aspect-Level Sentiment Classification

Abstract: Aspect-level sentiment classification is a crucial task for sentiment analysis, which aims to identify the sentiment polarities of specific targets in their context. The main challenge comes from multi-aspect sentences, which express multiple sentiment polarities towards different targets, resulting in overlapped feature representations. However, most existing neural models tend to utilize static pooling operations or attention mechanisms to identify sentiment words, which are therefore insufficient for dealing with…
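To make the attention mechanism the abstract refers to concrete, the following is a minimal, hypothetical sketch (in PyTorch) of aspect-aware attention: the target representation scores each context word, and the attention-weighted context is fed to a polarity classifier. The class and parameter names (AspectAttention, hidden_dim, num_polarities) are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AspectAttention(nn.Module):
    # Hypothetical aspect-aware attention layer; not the paper's architecture.
    def __init__(self, hidden_dim, num_polarities=3):
        super().__init__()
        self.score = nn.Linear(2 * hidden_dim, 1)            # scores each (context word, aspect) pair
        self.classify = nn.Linear(hidden_dim, num_polarities)

    def forward(self, context, aspect):
        # context: (batch, seq_len, hidden_dim) encoded context words
        # aspect:  (batch, hidden_dim) pooled representation of the target/aspect
        seq_len = context.size(1)
        aspect_exp = aspect.unsqueeze(1).expand(-1, seq_len, -1)
        scores = self.score(torch.cat([context, aspect_exp], dim=-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)                   # one weight per context word
        sentence = torch.bmm(weights.unsqueeze(1), context).squeeze(1)  # attention-weighted context
        return self.classify(sentence), weights

In a multi-aspect sentence this layer would run once per target, so shared sentiment words can pull the attention weights of different targets toward the same positions; that overlap is the problem the capsule-based approach in the title targets.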

Cited by 60 publications (24 citation statements) · References 21 publications
“…Following works include applying memory network-based (Tang, Qin & Liu, 2016) and attention-based (Chen et al., 2017) methods to LSTM models, involving two stacked LSTMs (Xu et al., 2020), and so on. More recent models such as the capsule network (Chen & Qian, 2019; Du et al., 2019), graph convolutional network (Zhang, Li & Song, 2019), graph attention network (Wang et al., 2020), and bi-level interactive graph convolution network (Zhang & Qian, 2020) are also used for the ABSA task. Zhu et al. (2019) exploited the interaction between the aspect category and the contents under the guidance of both sentiment polarity and predefined categories, and the proposed aspect-aware learning framework achieved satisfactory performance in ABSA.…”
Section: Literature Review (mentioning)
confidence: 99%
“…Our work is inspired by the work on capsule graph neural networks [28], especially its application to aspect extraction [3,5]. These works mainly focus on capsule networks for aspect-level sentiment classification.…”
Section: Related Work (mentioning)
confidence: 99%
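Since several citing works single out the capsule network component, here is a minimal sketch of the generic routing-by-agreement step that capsule models of this kind typically build on (following Sabour et al.'s dynamic routing); shapes and names are illustrative assumptions, not the routing used in the cited papers.

import torch
import torch.nn.functional as F

def squash(v, dim=-1, eps=1e-8):
    # Shrinks a capsule vector's length into (0, 1) while preserving its direction.
    norm_sq = (v ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * v / torch.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: (batch, num_in, num_out, out_dim) prediction vectors from lower-level capsules.
    b = torch.zeros(u_hat.shape[:3], device=u_hat.device)   # routing logits
    for _ in range(num_iters):
        c = F.softmax(b, dim=2)                              # coupling coefficients per lower capsule
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)             # weighted sum -> (batch, num_out, out_dim)
        v = squash(s)                                        # higher-level (e.g., sentiment-category) capsules
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)         # agreement increases the routing weight
    return v

In an aspect-level setting, the length of each output capsule can be read as the probability of the corresponding sentiment category for the given target.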
“…To investigate the impact of different components of our proposed adversarial multi-task learning framework, we conduct experiments based on the RAM model and report the results of the different structures. As shown in Table 2, simply incorporating synthetic samples (w/ synthetic) leads only to marginal improvements, since some of the synthetic samples may introduce noise into the training process.…”
Section: Ablation Study (mentioning)
confidence: 99%