2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2018.8461414
Semi-Supervised and Transfer Learning Approaches for Low Resource Sentiment Classification

Abstract: Sentiment classification involves quantifying the affective reaction of a human to a document, media item or an event. Although researchers have investigated several methods to reliably infer sentiment from lexical, speech and body language cues, training a model with a small set of labeled datasets is still a challenge. For instance, in expanding sentiment analysis to new languages and cultures, it may not always be possible to obtain comprehensive labeled datasets. In this paper, we investigate the applicati…

Cited by 24 publications (18 citation statements). References 17 publications.
“…Low-resource NLP: Previous work on low-resource NLP tasks includes feature engineering (Tan and Zhang, 2008), which requires a recurring effort when adapting to a new data set. Another approach is to transfer knowledge across domains to increase the amount of data that is available for training (Zoph et al., 2016; Nguyen and Chiang, 2017; Kocmi and Bojar, 2018; Yang et al., 2017; Gupta et al., 2018). One of these approaches relied on adversarial training (Goodfellow et al., 2014) to learn a domain-adaptive classifier (Ganin et al., 2016) in another domain or language where training data was plentiful, while ensuring that the model generalizes to the low-resource domain (Chen et al., 2018).…”
Section: Related Work (mentioning)
confidence: 99%
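The adversarial route this statement refers to (Ganin et al., 2016) can be made concrete with a gradient-reversal sketch: a domain discriminator is trained to tell source from target documents while the reversed gradient pushes the shared feature extractor toward domain-invariant representations, so sentiment supervision from the high-resource domain transfers to the low-resource one. This is a minimal PyTorch illustration, not the cited authors' implementation; the 300-dimensional document embeddings, layer sizes, and toy batches are assumptions.

```python
# Minimal sketch of adversarial domain adaptation (gradient reversal),
# in the spirit of Ganin et al. (2016). Sizes and data are illustrative.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass, negated (scaled) gradient on backward."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


feature_extractor = nn.Sequential(nn.Linear(300, 128), nn.ReLU())
sentiment_head = nn.Linear(128, 2)   # positive / negative
domain_head = nn.Linear(128, 2)      # source / target

params = (list(feature_extractor.parameters())
          + list(sentiment_head.parameters())
          + list(domain_head.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-3)
ce = nn.CrossEntropyLoss()

# Toy batches: labeled source documents and unlabeled target documents,
# both represented by assumed 300-dim document embeddings.
src_x, src_y = torch.randn(32, 300), torch.randint(0, 2, (32,))
tgt_x = torch.randn(32, 300)

for step in range(100):
    optimizer.zero_grad()
    src_feat = feature_extractor(src_x)
    tgt_feat = feature_extractor(tgt_x)

    # Supervised sentiment loss on the source domain only.
    cls_loss = ce(sentiment_head(src_feat), src_y)

    # Adversarial domain loss: the discriminator separates domains, while the
    # reversed gradient pushes the features to become domain-invariant.
    feats = torch.cat([src_feat, tgt_feat])
    domains = torch.cat([torch.zeros(32, dtype=torch.long),
                         torch.ones(32, dtype=torch.long)])
    dom_loss = ce(domain_head(GradReverse.apply(feats, 1.0)), domains)

    (cls_loss + dom_loss).backward()
    optimizer.step()
```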
“…Adapting pre-trained models: The effectiveness of transfer learning in low-resource settings was previously demonstrated for machine translation (Zoph et al., 2016; Nguyen and Chiang, 2017; Kocmi and Bojar, 2018), sequence tagging (Yang et al., 2017) and sentiment classification (Gupta et al., 2018) … close to the target corpus. Some previous work has analyzed the performance behavior of the BERT model in different scenarios.…”
Section: Related Work (mentioning)
confidence: 99%
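The transfer-learning recipe referenced in this statement, pre-training where labels are plentiful and fine-tuning on the low-resource target, can be sketched as below. This is a hedged PyTorch illustration under assumed shapes, toy data, and learning rates; it is not the setup of the cited papers (Zoph et al., 2016; Gupta et al., 2018).

```python
# Sketch of transfer learning for low-resource sentiment classification:
# pre-train an encoder on a large source corpus, then fine-tune on the small
# labeled target set with a smaller learning rate for the transferred layers.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(300, 128), nn.ReLU())   # shared layers
source_head = nn.Linear(128, 2)
target_head = nn.Linear(128, 2)
ce = nn.CrossEntropyLoss()

# 1) Pre-train encoder + source head on plentiful source-domain labels.
src_x, src_y = torch.randn(2048, 300), torch.randint(0, 2, (2048,))
opt = torch.optim.Adam(list(encoder.parameters())
                       + list(source_head.parameters()), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    ce(source_head(encoder(src_x)), src_y).backward()
    opt.step()

# 2) Fine-tune on the low-resource target set: the transferred encoder gets a
#    small learning rate, the freshly initialized target head a larger one.
tgt_x, tgt_y = torch.randn(64, 300), torch.randint(0, 2, (64,))
opt = torch.optim.Adam([
    {"params": encoder.parameters(), "lr": 1e-4},
    {"params": target_head.parameters(), "lr": 1e-3},
])
for _ in range(50):
    opt.zero_grad()
    ce(target_head(encoder(tgt_x)), tgt_y).backward()
    opt.step()
```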
“…Semi-supervised learning is a fundamental solution that allows learning from heterogeneous unlabeled examples spread via social networks in combination with typically small sets of labeled data. In this sense, several studies used semi-supervised techniques primarily to develop automated corpora [23] by utilizing word representation capabilities in two ways: (1) generating context-dependent embeddings, and (2) transferring knowledge from pretrained embeddings or fusing weighted distributed features [24][25][26]. The ultimate problem is that these corpora yield low recall for medical entity recognition because they fail to recognize drug-related components and their aspects, such as drug reactions and indications. Extracting explicit drug-related aspects may fail due to the existence of adverse drug reaction (ADR) multi-word expressions, leading to a higher accumulation of false positives and false negatives.…”
Section: Aspect-based Sentiment Analysis (mentioning)
confidence: 99%
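One classic way to combine a small labeled seed set with abundant unlabeled text, as the statement above describes, is self-training (pseudo-labeling): train on the labeled seed, label the unlabeled pool, and keep only high-confidence predictions as new training data. The scikit-learn sketch below is a minimal illustration; the random stand-in features, the 0.9 confidence threshold, and the number of rounds are assumptions, not details from the cited studies.

```python
# Minimal self-training (pseudo-labeling) loop for semi-supervised
# sentiment classification. Features are random stand-ins for document
# embeddings; sizes and thresholds are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_lab = rng.normal(size=(50, 300))
y_lab = rng.integers(0, 2, size=50)
X_unlab = rng.normal(size=(1000, 300))

clf = LogisticRegression(max_iter=1000)
for _ in range(5):                        # a few self-training rounds
    clf.fit(X_lab, y_lab)
    proba = clf.predict_proba(X_unlab)
    confident = proba.max(axis=1) > 0.9   # keep only confident pseudo-labels
    if not confident.any():
        break
    X_lab = np.vstack([X_lab, X_unlab[confident]])
    y_lab = np.concatenate([y_lab, proba[confident].argmax(axis=1)])
    X_unlab = X_unlab[~confident]
```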
“…Survey articles such as those by Liu et al. [12] and Medhat et al. [13] provide a summary of such techniques. Sentiment analysis has previously been studied in low-resource settings through methods such as transfer learning [14] and semi-supervised learning [15]. The set of techniques proposed for sentiment analysis in the absence of labeled data includes manifold regularization [14], semi-supervised recursive autoencoders [16], document-word co-regularization [17] and latent variable models [18].…”
Section: Introduction (mentioning)
confidence: 99%
“…Sentiment analysis has previously been studied in low-resource settings through methods such as transfer learning [14] and semi-supervised learning [15]. The set of techniques proposed for sentiment analysis in the absence of labeled data includes manifold regularization [14], semi-supervised recursive autoencoders [16], document-word co-regularization [17] and latent variable models [18]. On the other hand, GAN models were proposed in 2014, and a tutorial by Goodfellow [19] provides background on GANs.…”
Section: Introduction (mentioning)
confidence: 99%
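Of the techniques listed in the statement above, manifold regularization [14] has a compact form: the supervised loss on the few labeled documents is augmented with a graph-smoothness penalty that discourages the classifier from changing its prediction between similar unlabeled documents. The PyTorch sketch below illustrates that idea under assumed data, a Gaussian-kernel similarity graph, and an arbitrary regularization weight; it is not the formulation of the cited paper.

```python
# Sketch of a manifold-regularized classifier: supervised loss on a few
# labeled documents plus a smoothness penalty over an unlabeled similarity
# graph. All data, kernel choices, and weights are illustrative assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(300, 64), nn.ReLU(), nn.Linear(64, 2))
ce = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x_lab, y_lab = torch.randn(20, 300), torch.randint(0, 2, (20,))
x_unlab = torch.randn(200, 300)

# Similarity graph over unlabeled points (Gaussian kernel, an assumption).
dist = torch.cdist(x_unlab, x_unlab)
W = torch.exp(-dist ** 2 / dist.mean() ** 2)

for _ in range(100):
    opt.zero_grad()
    sup = ce(model(x_lab), y_lab)

    # Penalize prediction differences between similar unlabeled documents.
    logits = model(x_unlab)
    pair_diff = (logits.unsqueeze(0) - logits.unsqueeze(1)).pow(2).sum(-1)
    manifold = (W * pair_diff).sum() / W.sum()

    (sup + 0.1 * manifold).backward()
    opt.step()
```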