Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, 2017
DOI: 10.18653/v1/e17-2091

Attention Modeling for Targeted Sentiment

Abstract: Neural network models have been used for target-dependent sentiment analysis. Previous work focuses on learning a target-specific representation for a given input sentence, which is then used for classification. However, these models do not explicitly capture the contribution of each word in a sentence with respect to targeted sentiment polarities. We investigate an attention model to this end. In particular, a vanilla LSTM model is used to induce an attention value of the whole sentence. The model is further extended to differ…
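As a concrete illustration of the mechanism the abstract describes, the sketch below is a minimal PyTorch model, not the authors' released code: the class name, dimensions, and the projection-based scoring are illustrative assumptions. An LSTM encodes the sentence, a target vector (the mean of the target-word embeddings) scores each hidden state, and the attention-weighted sum feeds a classifier.

```python
# Minimal sketch (assumed architecture, not the paper's exact model):
# per-word attention over LSTM states, conditioned on a target vector.
import torch
import torch.nn as nn

class TargetedAttentionLSTM(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid_dim=100, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.proj = nn.Linear(emb_dim, hid_dim)   # maps target into hidden space
        self.out = nn.Linear(hid_dim, n_classes)

    def forward(self, sent_ids, target_ids):
        # sent_ids: (B, T) word ids; target_ids: (B, S) ids of the target mention
        h, _ = self.lstm(self.embed(sent_ids))             # (B, T, H)
        t = self.proj(self.embed(target_ids).mean(dim=1))  # (B, H) target vector
        scores = torch.bmm(h, t.unsqueeze(2)).squeeze(2)   # (B, T) dot scores
        alpha = torch.softmax(scores, dim=-1)              # per-word attention
        ctx = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # (B, H) weighted sum
        return self.out(ctx)                               # sentiment logits
```

The three-way output head corresponds to the usual positive/negative/neutral scheme in targeted sentiment; the attention weights `alpha` are the per-word contributions the abstract refers to.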

Cited by 207 publications (93 citation statements)
References 16 publications
“…Text. The baselines (e.g., SVM [8], CNN [15], Transformer [36], and BiLSTM-Attention [18]) solely relied on textual information and did not perform well. Correspondingly, most graph-embedding approaches with different mechanisms can improve task performance.…”
Section: Experiments Results and Analysis
confidence: 99%
“…3. Bidirectional Recurrent Neural Network (BiLSTM) with Attention Mechanism [18]: We used a bidirectional LSTM to represent the word sequence in a review, then calculated the weighted values over each word in the review with an attention model. This baseline was denoted as BiLSTM-Attention.…”
Section: Experiments 4.1 Dataset and Experiments Setting
confidence: 99%
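A hedged sketch of such a BiLSTM-Attention baseline follows (PyTorch; class name and hyperparameters are illustrative, not the citing paper's code): a bidirectional LSTM encodes the review and a learned scoring layer yields one attention weight per word before classification.

```python
# Sketch of a BiLSTM-Attention baseline (assumed setup, not the cited code).
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid_dim=100, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                              bidirectional=True)
        self.score = nn.Linear(2 * hid_dim, 1)   # one scalar score per word
        self.out = nn.Linear(2 * hid_dim, n_classes)

    def forward(self, ids):
        # ids: (B, T) word ids of a review
        h, _ = self.bilstm(self.embed(ids))       # (B, T, 2H)
        alpha = torch.softmax(self.score(h).squeeze(-1), dim=-1)  # (B, T)
        ctx = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)         # (B, 2H)
        return self.out(ctx)
```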
“…Various neural models (Dong et al., 2014; Nguyen and Shirai, 2015; Vo and Zhang, 2015; Tang et al., 2016a,b; Wang et al., 2016; Liu and Zhang, 2017; Chen et al., 2017) have been proposed for aspect-level sentiment classification. The main idea behind these works is to develop neural architectures that are able to learn continuous features and capture the intricate relation between a target and context words.…”
Section: Related Work
confidence: 99%
“…We use the mean vector of a span's hidden features as a target vector and apply dot-wise attention over the sentence hidden features. This kind of attention mechanism is widely used in targeted sentiment analysis [17,18].…”
Section: Sentence Features
confidence: 99%
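The dot-wise attention in the quoted passage can be sketched as below (function and tensor names are illustrative; it assumes the hidden features have already been produced by some encoder): the span's hidden states are mean-pooled into a target vector, dot products against every sentence position give scores, and the softmax-weighted sum is the attended sentence vector.

```python
# Sketch of mean-pooled span vector + dot-product attention (assumed names).
import torch

def span_dot_attention(sent_h, span_h):
    """sent_h: (T, H) sentence hidden features; span_h: (S, H) span hidden features."""
    target = span_h.mean(dim=0)           # (H,) mean of the span's hidden states
    scores = sent_h @ target              # (T,) dot-product score per word
    alpha = torch.softmax(scores, dim=0)  # attention distribution over words
    return alpha @ sent_h                 # (H,) attended sentence vector

sent_h = torch.randn(12, 64)   # e.g., 12 words, hidden size 64
span_h = sent_h[3:5]           # hidden states of a 2-word target span
context = span_dot_attention(sent_h, span_h)
print(context.shape)           # torch.Size([64])
```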