Proceedings of the 19th ACM International Conference on Information and Knowledge Management 2010
DOI: 10.1145/1871437.1871488

A robust semi-supervised classification method for transfer learning

Abstract: The transfer learning problem of designing classifiers with high generalization ability from labeled samples whose distribution differs from that of the test samples is an important and challenging research issue in the fields of machine learning and data mining. This paper focuses on designing a semi-supervised classifier trained using unlabeled samples drawn from the same distribution as the test samples, and presents a semi-supervised classification method to deal with the transfer learning problem…

Cited by 7 publications (5 citation statements). References 32 publications.
“…Two strategies are used for services classification: supervised and unsupervised. For a supervised classification, such as the ones suggested in (Fujino et al., 2010; He et al., 2004; Oldham et al., 2005; Crasso et al., 2008), two phases are required: training and classification phases. The training phase is needed to build a classifier from a collection of categorized services, while the classification phase associates a new service to one or more classes.…”
Section: Related Work
confidence: 99%
“…, s_K} given the feature vector x of a word instance is modeled by a combination of discriminative and generative models, P_d(s|x; W) and p_g(x, s; Θ), where W and Θ are the respective parameters of these models. By applying the classifier form and training method presented in Fujino et al. [2010], we define P(s|x) as the following. We also provide an objective function J for the parameter estimation of P(s_k|x; W, Θ, β) by using labeled and unlabeled datasets…”
Section: MHLE-based Semi-supervised WSD Classifier
confidence: 99%
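The classifier form itself is elided in the quotation. Under the notation in the snippet, a hybrid of the discriminative and generative models would plausibly take a normalized-product form such as the following; this is a sketch under that assumption, not the paper's exact equation:

```latex
P(s_k \mid x;\, W, \Theta, \beta)
  = \frac{P_d(s_k \mid x;\, W)^{\beta}\; p_g(x, s_k;\, \Theta)^{1-\beta}}
         {\sum_{k'=1}^{K} P_d(s_{k'} \mid x;\, W)^{\beta}\; p_g(x, s_{k'};\, \Theta)^{1-\beta}}
```

Here β would act as a mixing weight between the two component models, with β = 1 recovering the purely discriminative posterior and β = 0 the purely generative one.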
“…The local optimal solution of W and Θ around an initial value can be obtained by an iterative process such as the EM algorithm [Dempster et al. 1977]. Namely, the MHLE-based semi-supervised WSD classifier is constructed by combining the discriminative and generative models trained on both labeled and unlabeled samples (see Fujino et al. [2010] for details of the combination and training methods).…”
Section: MHLE-based Semi-supervised WSD Classifier
confidence: 99%
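The model combination described in these snippets can be illustrated with a minimal sketch. Assume, hypothetically, that the discriminative and generative models each yield per-class probabilities for an input and that they are blended by a normalized weighted geometric mean with mixing weight beta; the function name and the blending rule below are illustrative, not taken from the paper.

```python
def hybrid_posterior(p_disc, p_gen, beta=0.5):
    """Blend discriminative and generative class probabilities with a
    normalized weighted geometric mean. Illustrative sketch only:
    beta is a hypothetical mixing weight, not the paper's exact form."""
    scores = [pd ** beta * pg ** (1.0 - beta) for pd, pg in zip(p_disc, p_gen)]
    z = sum(scores)  # normalize so the blended scores sum to 1
    return [s / z for s in scores]

# Two-class example: the discriminative model is confident, the
# generative model less so; the blended posterior lies between them.
posterior = hybrid_posterior([0.9, 0.1], [0.6, 0.4], beta=0.5)
print(posterior)
```

In a semi-supervised setting of this kind, the discriminative component would typically be fit on the labeled samples and the generative component on both labeled and unlabeled samples, with the mixing weight tuned on held-out data.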
“…As machine learning theory continues to develop, it is impossible to meet the needs of classifier training merely by relying on a limited number of labeled samples, and it becomes extremely important to make use of unlabeled samples for co-training. Fujino et al. [6] designed a semi-supervised classifier (MHLE) trained using unlabeled samples drawn from the same distribution as the test samples, and presented a semi-supervised classification method to deal with the transfer learning problem. MHLE needs a discriminative and a generative model to be designated beforehand for a specific problem, which requires much domain knowledge.…”
Section: Introduction
confidence: 99%