Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2016
DOI: 10.18653/v1/p16-1155

Cross-domain Text Classification with Multiple Domains and Disparate Label Sets

Abstract: Advances in transfer learning have eased the dependence of traditional supervised machine learning algorithms on annotated training data for every new domain. However, several applications encounter scenarios where models need to transfer/adapt across domains whose label sets differ both in the number of labels and in their connotations. This paper presents a first-of-its-kind transfer learning algorithm for cross-domain classification with multiple source d…
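To make the setting concrete, here is a minimal sketch of the problem the abstract describes: several labeled source domains whose label sets differ in both size and meaning, plus an unlabeled target domain. All data, domain names, and labels below are hypothetical, and the per-source baseline shown is not the paper's algorithm, only the naive starting point such transfer methods aim to improve on.

```python
# Hypothetical illustration of the cross-domain setting: each source
# domain carries its own label set; the target domain is unlabeled.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Two source domains with disparate label sets (toy data, assumptions).
sources = {
    "electronics": (["battery dies fast", "great screen quality"],
                    ["negative", "positive"]),
    "support_tickets": (["cannot log in", "refund not received"],
                        ["auth_issue", "billing_issue"]),
}
target_texts = ["screen flickers after update"]  # unlabeled target domain

# Naive baseline: one classifier per source domain, each predicting in
# its own label space. Transferring across these disparate label sets
# is exactly what this baseline cannot do and the paper addresses.
for domain, (texts, labels) in sources.items():
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(texts, labels)
    print(domain, clf.predict(target_texts))
```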

Cited by 12 publications (5 citation statements) | References 24 publications
“…should be classified as the topic 'Apple' although it does not contain the keyword 'Apple' and the keyword 'Tim Cook' is not contained in the training samples. In other words, a reliable classifier should learn decision rules that generalize across domains (Fei and Liu 2015; Bhatt, Semwal, and Roy 2015; Bhatt, Sinha, and Roy 2016). This problematic phenomenon frequently happens in real-world datasets.…”
Section: Class: Apple (mentioning)
confidence: 99%
“…While Word2Vec models are more flexible with respect to the training domain, Doc2vec models are more advantageous when documents are large (Lau & Baldwin, 2016). However, both neural representations are expected to outperform the traditional TF-IDF representation (Bhatt et al., 2016).…”
Section: Literature Review (mentioning)
confidence: 99%
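To illustrate the comparison in the statement above, the sketch below builds both representations on a toy corpus: sparse TF-IDF vectors versus dense Doc2vec embeddings. The corpus and hyperparameters are illustrative assumptions, not taken from the cited papers.

```python
# Minimal comparison of the two representations discussed above:
# sparse TF-IDF vectors versus dense Doc2vec document embeddings.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = ["tim cook unveiled the new iphone",
          "the orchard harvested ripe apples",
          "apple reported record quarterly earnings"]

# TF-IDF: purely lexical, no generalization across related words.
tfidf = TfidfVectorizer().fit_transform(corpus)
print("TF-IDF shape:", tfidf.shape)

# Doc2vec: embeddings learned jointly over words and documents, which
# is what makes them attractive as document length grows.
tagged = [TaggedDocument(doc.split(), [i]) for i, doc in enumerate(corpus)]
d2v = Doc2Vec(tagged, vector_size=32, min_count=1, epochs=40)
print("Doc2vec vector:", d2v.infer_vector("apple earnings call".split())[:5])
```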
“…In the text domain, researchers have proposed various methods to tackle this problem based on non-deep neural networks [4, 28–36]. Blitzer et al. presented structural correspondence learning (SCL) for transfer learning [4].…”
Section: Cross-domain Transfer Learning (mentioning)
confidence: 99%
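For readers unfamiliar with SCL, the following sketch shows its core step under simplifying assumptions: linear predictors estimate each pivot feature from the non-pivot features on unlabeled data, and an SVD of the stacked predictor weights yields a cross-domain projection. Blitzer et al. used a modified Huber loss and data-driven pivot selection; plain least squares and hand-picked pivot indices stand in here to keep the example short.

```python
# Compact sketch of the core of structural correspondence learning:
# predict pivot features from non-pivot features on unlabeled data,
# then SVD the pivot-predictor weights to get a shared projection.
import numpy as np

rng = np.random.default_rng(0)
X = (rng.random((200, 50)) > 0.8).astype(float)  # unlabeled docs x features
pivot_idx = [0, 1, 2, 3, 4]        # pivots: features frequent in all domains
nonpivot_idx = [i for i in range(50) if i not in pivot_idx]

# One linear predictor per pivot: does a document contain the pivot?
W = []
Xn = X[:, nonpivot_idx]
for p in pivot_idx:
    w, *_ = np.linalg.lstsq(Xn, X[:, p], rcond=None)
    W.append(w)
W = np.stack(W, axis=1)             # (n_nonpivot, n_pivots)

# SVD of the weight matrix yields the shared representation theta.
U, _, _ = np.linalg.svd(W, full_matrices=False)
theta = U[:, :3].T                  # top-k left singular vectors
augmented = np.hstack([X, Xn @ theta.T])  # original + projected features
print("augmented feature shape:", augmented.shape)
```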
“…Bhatt et al. presented a cross-domain classification method that learns an accurate model for a new, unlabeled target domain given labeled data from multiple source domains whose label sets may all differ [34]. Qu et al. proposed a transfer learning-based approach to named entity recognition in novel domains with label mismatch relative to a source domain [35].…”
Section: Cross-domain Transfer Learning (mentioning)
confidence: 99%
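As a hedged illustration of relating disparate label sets, the sketch below represents each source label by the TF-IDF centroid of its documents, clusters the unlabeled target, and names each cluster after its nearest source label. This is a naive alignment heuristic for intuition only, not Bhatt et al.'s actual algorithm; all data are made up.

```python
# Naive illustration of aligning disparate label sets across domains:
# match target clusters to the most similar source-label centroid.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

source_texts = ["battery drains quickly", "screen looks stunning",
                "charger stopped working", "love the camera quality"]
source_labels = ["negative", "positive", "negative", "positive"]
target_texts = ["display is gorgeous", "battery life is awful"]  # unlabeled

vec = TfidfVectorizer().fit(source_texts + target_texts)
Xs, Xt = vec.transform(source_texts), vec.transform(target_texts)

# Source label centroids in the shared feature space.
labels = sorted(set(source_labels))
centroids = np.vstack([Xs[[i for i, l in enumerate(source_labels) if l == y]]
                       .mean(axis=0) for y in labels])

# Cluster the target, then name each cluster after its nearest label.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(Xt.toarray())
sim = cosine_similarity(km.cluster_centers_, np.asarray(centroids))
mapping = {c: labels[sim[c].argmax()] for c in range(2)}
print([mapping[c] for c in km.labels_])
```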