Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d16-1023

Learning Sentence Embeddings with Auxiliary Tasks for Cross-Domain Sentiment Classification

Abstract: In this paper, we study cross-domain sentiment classification with neural network architectures. We borrow the idea from Structural Correspondence Learning and use two auxiliary tasks to help induce a sentence embedding that supposedly works well across domains for sentiment classification. We also propose to jointly learn this sentence embedding together with the sentiment classifier itself. Experiment results demonstrate that our proposed joint model outperforms several state-of-the-art methods on five benchm…

Cited by 199 publications (144 citation statements). References 20 publications.
“…Most multi-task learning studies have focused on improving the performance of the main task, or of all tasks, by sharing lower-layer parameters among multiple tasks (Zhang et al. 2014; Yu and Jiang 2016; Liu et al. 2015; Cheng, Fang, and Ostendorf 2015). For example, in the work of Zhang et al., head pose estimation and facial attribute inference are used as auxiliary tasks, and all parameters are shared except those of each task's output layer.…”
Section: Multi-task Learning
confidence: 99%
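As a concrete illustration of the hard parameter sharing described above, here is a minimal sketch in PyTorch (module structure and dimensions are hypothetical, not taken from the cited works): all layers feed every task, and only the per-task output heads hold private parameters.

```python
import torch
import torch.nn as nn

class HardSharingMultiTask(nn.Module):
    """Multi-task network with hard parameter sharing: the trunk is shared
    by all tasks; only the output heads are task-specific."""
    def __init__(self, in_dim, hidden_dim, task_out_dims):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # one output head per task (main task plus auxiliary tasks)
        self.heads = nn.ModuleList(nn.Linear(hidden_dim, d) for d in task_out_dims)

    def forward(self, x):
        h = self.shared(x)                 # shared representation
        return [head(h) for head in self.heads]

# e.g. a 2-way main task with 3-way and 5-way auxiliary tasks
model = HardSharingMultiTask(in_dim=16, hidden_dim=32, task_out_dims=[2, 3, 5])
outs = model(torch.randn(4, 16))
print([tuple(o.shape) for o in outs])      # prints [(4, 2), (4, 3), (4, 5)]
```

Summing the per-task losses over these outputs and backpropagating updates the shared trunk from every task at once, which is what lets the auxiliary tasks regularize the main one.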
“…Existing domain adaptation tasks for sentiment analysis focus on traditional sentiment classification without considering the aspect (Blitzer, Dredze, and Pereira 2007; Pan et al. 2010; Glorot, Bordes, and Bengio 2011; Chen et al. 2012; Bollegala, Weir, and Carroll 2013; Yu and Jiang 2016; Li et al. 2018b). Given the scarcity of data and the value of the task, transfer learning is even more pressing for aspect-level sentiment analysis, which characterizes users' differing preferences.…”
Section: Related Work
confidence: 99%
“…To address this problem, unsupervised domain adaptation methods can be applied. Existing methods focus on traditional cross-domain sentiment classification, learning shared representations for sentences or documents via pivot-based methods (Blitzer et al., 2007; Pan et al., 2010; Bollegala et al., 2013; Yu and Jiang, 2016), auto-encoders (Glorot et al., 2011; Chen et al., 2012; Zhou et al., 2016), domain adversarial networks (Ganin et al., 2016; Li et al., 2018c), or semi-supervised methods (He et al., 2018a). Due to the difficulties of fine-grained adaptation, very few methods exist for cross-domain aspect extraction (Ding et al., 2017), a sub-task of E2E-ABSA, or for aspect and opinion co-extraction (Li et al., 2012; Wang and Pan, 2018), which detects aspect and opinion words, whereas E2E-ABSA must analyze more complicated correspondences between them.…”
Section: Related Work
confidence: 99%
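The domain adversarial networks cited above (Ganin et al., 2016) hinge on a gradient reversal layer: the forward pass is the identity, while the backward pass flips the gradient's sign, so the feature extractor is pushed toward domain-invariant representations. A minimal sketch, assuming PyTorch (`lam`, the reversal strength, is an illustrative name):

```python
import torch

class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity on the forward pass,
    multiplies the gradient by -lam on the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)          # identity forward

    @staticmethod
    def backward(ctx, grad_output):
        # reverse (and scale) the gradient; no gradient for lam itself
        return -ctx.lam * grad_output, None

x = torch.randn(3, 4, requires_grad=True)
y = GradReverse.apply(x, 1.0)
y.sum().backward()                   # upstream gradient is all ones
print(torch.allclose(x.grad, -torch.ones(3, 4)))  # prints True
```

In a full model this layer would sit between the shared feature extractor and a domain classifier, so that minimizing the domain loss through the reversed gradient maximizes domain confusion in the shared features.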