Proceedings of the 18th ACM Conference on Information and Knowledge Management 2009
DOI: 10.1145/1645953.1646121
Large margin transductive transfer learning

Abstract: Recently there has been increasing interest in the problem of transfer learning, in which the typical assumption that training and testing data are drawn from identical distributions is relaxed. We specifically address the problem of transductive transfer learning in which we have access to labeled training data and unlabeled testing data potentially drawn from different, yet related distributions, and the goal is to leverage the labeled training data to learn a classifier to correctly predict data from the te…


Cited by 105 publications (62 citation statements)
References 31 publications
“…There are three baseline methods tested where different classifiers are trained with the labeled source data. There are five transfer learning methods tested against, which include methods by Ling [66], Pan [83], Pan [87], Quanz [94], and Xiao [133]. The order of performance from best to worst is ARTL, Xiao [133], Pan [87], Pan [83], Quanz [94] and Ling [66] (tie), and the baseline approaches.…”
Section: Asymmetric Feature-based Transfer Learning
confidence: 99%
“…With the encouraging successes in applying metric learning techniques to various problems (e.g., machine translation [9], [10], multimedia information retrieval [13], and visual recognition [16], [28]), metric learning methods have also found their way into visual recognition tasks, especially in the form of domain adaptation and transfer learning [26], [28].…”
Section: Related Work
confidence: 99%
“…Unlike conventional learning assumptions where the training data and the testing data follow the identical probability distribution [26], this problem requires techniques that can incorporate the beneficial information from the auxiliary view into the training of a classification model that works only on the main view. We shall emphasize that this problem is different from the domain adaptation or the transfer learning problems [9], [16], [28].…”
Section: Introduction
confidence: 99%
“…Large-margin transductive transfer learning (LMPROJ) [28] is most related to our model, but with two major differences: (1) it uses the same mapping function for the source and target domains; (2) it utilizes the projected mean discrepancy distance instead of squared loss as the reconstruction loss. The basic assumption is that, in the latent space, the centroid of examples in the source domain should be as close as possible to the centroid of examples in the target domain.…”
Section: Connections To Existing Work
confidence: 99%
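The projected mean discrepancy described in the statement above measures how far apart the source and target centroids are after a shared projection. A minimal sketch of that idea (the function name, the projection matrix `W`, and all shapes are illustrative assumptions, not the actual LMPROJ formulation):

```python
import numpy as np

def projected_mean_discrepancy(Xs, Xt, W):
    """Squared Euclidean distance between the centroids of source (Xs)
    and target (Xt) samples after projecting both with the same map W,
    mirroring the 'same mapping function for both domains' assumption."""
    mu_s = (Xs @ W).mean(axis=0)  # centroid of projected source examples
    mu_t = (Xt @ W).mean(axis=0)  # centroid of projected target examples
    return float(np.sum((mu_s - mu_t) ** 2))

# Toy usage: identical source/target samples give zero discrepancy.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(100, 5))
Xt = Xs.copy()
W = rng.normal(size=(5, 3))
print(projected_mean_discrepancy(Xs, Xt, W))  # prints 0.0
```

Minimizing a quantity like this over `W` (jointly with a classification loss) encourages the two domains to look alike in the latent space, which is the stated intuition behind matching the source and target centroids.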
“…This learning problem, i.e., abundant labeled examples in the source domain and unlabeled examples in the target domain, is also known as transductive transfer learning. There have been several studies addressing the problem [2], [28], but it is far from being solved for real applications and is therefore the main focus of this paper.…”
Section: Introduction
confidence: 99%