Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.22
Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training

Abstract: Aspect-based sentiment analysis aims to identify the sentiment polarity of a specific aspect in product reviews. We notice that about 30% of reviews do not contain obvious opinion words, but still convey clear human-aware sentiment orientation, which is known as implicit sentiment. However, recent neural network-based approaches have paid little attention to implicit sentiment entailed in the reviews. To overcome this issue, we adopt Supervised Contrastive Pre-training on large-scale sentiment-annotated corpora retri…

Cited by 63 publications (27 citation statements) · References 31 publications
“…The transcripts used for the AD diagnosis of spontaneous speech contain no polarity markers; however, most previous studies in this field pay little attention to implicit sentiment expressions. The study (44) used supervised contrastive learning, an advanced method, to capture implicit sentiment: expressions with the same sentiment polarity were pulled together, and those with different sentiment orientations were pushed apart.…”
Section: Future Work
confidence: 99%
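The pull-together/push-apart mechanism described in the statement above is the core of a supervised contrastive loss. A minimal NumPy sketch of that idea follows; the function name `supcon_loss`, the temperature value, and the toy embeddings are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings:
    same-label pairs are pulled together, different-label pairs
    are pushed apart (toy sketch of the idea, not the paper's code)."""
    # L2-normalize so dot products are cosine similarities
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature            # pairwise similarity / tau
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    sim[self_mask] = -np.inf                       # exclude self-pairs
    # row-wise log-softmax: log p(j | anchor i) over all other examples
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[None, :] == labels[:, None]) & ~self_mask
    # average negative log-probability over each anchor's positives
    per_anchor = -np.where(pos_mask, log_prob, 0.0).sum(axis=1) \
                 / np.maximum(pos_mask.sum(axis=1), 1)
    return per_anchor.mean()
```

When same-polarity embeddings lie close together the loss is small; when they are scattered among the other polarity it grows, which is exactly the "pull together / push apart" behaviour the citation describes.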
“…In recent years, a large number of outstanding open-source ASC (Li et al., 2021a; Tian et al., 2021; Li et al., 2021b; and ATESC (Li et al., 2018b; Xu et al., 2018; Ma et al., 2019; Yang, 2019;…”
Section: Related Work
confidence: 99%
“…Sentiment-enhanced word embedding. To address the above problems, we seek a method that builds the mapping between words and their sentiment orientations in sentiment lexicons and fuses sentiment information into these words. Fortunately, we are inspired by translations in the embedding space (TransE) [19] and contrastive learning [20,21]. TransE is a knowledge graph embedding method.…”
Section: Sentiment Orientation
confidence: 99%
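TransE, mentioned in the statement above, embeds a knowledge-graph relation as a translation vector, so a plausible (head, relation, tail) triple satisfies h + r ≈ t. A minimal NumPy sketch; the function name and toy vectors are illustrative, not from the cited work:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score: treat the relation embedding r as a
    translation in vector space, so a true triple has h + r close to t.
    Lower L2 distance means a more plausible triple."""
    return np.linalg.norm(h + r - t)
```

In the sentiment-lexicon setting the citation describes, a hypothetical triple could be (word, has-sentiment-orientation, polarity), scored by how well the word embedding translated by the relation vector lands on the polarity embedding.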
“…Meanwhile, contrastive learning narrows the distance between positive examples and increases the distance between negative examples in vector space. Bootstrap your own latent (BYOL) [20] minimizes the distance between two similar images; supervised contrastive pre-training (SCAPT) [21] uses supervised contrastive learning to cluster explicit and implicit positive sentiments, cluster explicit and implicit negative sentiments, and separate these two clusters. Borrowing these ideas, we propose a novel sentiment-enhanced word embedding (S-EWE) method to improve the performance of sentiment classification models.…”
Section: Sentiment Orientation
confidence: 99%