2015
DOI: 10.1007/s12652-015-0296-5

Bagging based ensemble transfer learning

Abstract: Nowadays, transfer learning is one of the main research areas in machine learning that is helpful for labeling the data with low cost. In this paper, we propose a novel bagging-based ensemble transfer learning (BETL). The BETL framework includes three operations: Initiate, Update, and Integrate. In the Initiate operation, we use bootstrap sampling to divide the source data into many subsets, and add the labeled data from the target domain into these subsets separately so that the source data and the target dat…
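The Initiate operation described in the abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name `initiate` and the data layout (a list of labeled examples) are hypothetical, and only the bootstrap-plus-target-merge step that the abstract names is shown.

```python
import random

def initiate(source, target_labeled, n_subsets, seed=0):
    """Sketch of the Initiate step (illustrative names, not from the paper):
    bootstrap-sample the source data into n_subsets subsets, then append the
    labeled target-domain data to every subset so each one mixes source and
    target examples."""
    rng = random.Random(seed)
    subsets = []
    for _ in range(n_subsets):
        # Bootstrap: draw |source| examples from source with replacement.
        boot = [source[rng.randrange(len(source))] for _ in range(len(source))]
        # Add all labeled target examples to this subset.
        subsets.append(boot + list(target_labeled))
    return subsets

# Toy data: 100 source examples, 5 labeled target examples.
source = [("s%d" % i, 0) for i in range(100)]
target = [("t%d" % i, 1) for i in range(5)]
subsets = initiate(source, target, n_subsets=10)
```

Each of the 10 subsets then holds 100 bootstrapped source examples plus the 5 target examples, ready for one base learner each.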

Cited by 21 publications (11 citation statements); References 22 publications
“…(1) The transfer precision is high for objects that are easy to classify, but when an object is difficult to distinguish, the transfer precision becomes very unreliable. (2) The weak classifier used in the corresponding literature [2,8] was the NB classifier, but in this study its effect was found to be not ideal. After comparing decision tree, NB, and SVM classifiers, it was found that using SVM as the weak classifier allowed TrBagg and BETL to reach a higher precision.…”
Section: Discussion
confidence: 83%
“…The sample usage in each experiment is shown in Table 1 below. Although TrBagg and BETL need only one labeled sample in the target domain at minimum, the stability of both methods is poor when the labeled target samples are too few, so we expanded them to five in this experiment. For the choice of weak classifier, although both TrBagg and BETL use a naive Bayes (NB) classifier in the corresponding references [2,8], after comparing several weak classifiers we used a support vector machine (SVM) classifier as the weak classifier for TrBagg and BETL in order to achieve higher precision.…”
Section: Experimental Setting
confidence: 99%
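The swap of weak classifier that this citation describes fits naturally into a generic bagging loop: train several base models on bootstrap samples and combine them by majority vote. The sketch below keeps the base learner pluggable (the cited study would pass an SVM here); the toy threshold learner, the function names, and the data are all illustrative assumptions, not the study's setup.

```python
import random
from collections import Counter

def bagged_predict(train_set, test_points, fit, n_learners=7, seed=0):
    """Generic bagging with a pluggable weak learner: train n_learners base
    models on bootstrap samples of train_set, then predict each test point
    by majority vote. `fit` stands in for any weak classifier (e.g. an SVM,
    as in the cited experiment)."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_learners):
        # Bootstrap sample of the (x, y) training pairs.
        boot = [train_set[rng.randrange(len(train_set))] for _ in train_set]
        models.append(fit(boot))
    preds = []
    for x in test_points:
        votes = Counter(m(x) for m in models)
        preds.append(votes.most_common(1)[0][0])
    return preds

# Toy weak learner: threshold at the mean x of the bootstrap sample.
def fit_threshold(sample):
    thr = sum(x for x, _ in sample) / len(sample)
    return lambda x: 1 if x >= thr else 0

train = [(i, 0) for i in range(10)] + [(i, 1) for i in range(10, 20)]
preds = bagged_predict(train, [2, 17], fit_threshold)
```

Because only `fit` changes between configurations, comparing NB, decision-tree, and SVM weak classifiers (as the study did) amounts to passing a different fitting function into the same bagging loop.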
“…The BP neural network is one of the most widely used neural network models. It can handle classification and prediction problems and offers highly nonlinear mapping, accurate function fitting, and self-organizing, self-learning, self-adaptive, and parallel-processing capabilities [21]. The Bagging algorithm is an ensemble learning method that improves the accuracy of learning algorithms by resampling both the samples and the feature attributes to generate multiple independent models [22]. The problem in this paper can be formally described as follows: given a coal and gas outburst data set D containing m data records…”
Section: BP Neural Network Ensemble Learning Model Based on Bagging
confidence: 99%
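The ensemble construction this citation describes, resampling both records and feature attributes to obtain independent training views, can be sketched as below. This is a hedged illustration, not the cited model: the function name `bagged_views`, the fraction of features kept, and the stand-in data are assumptions, and the BP-network training that would consume each view is omitted.

```python
import random

def bagged_views(data, n_features, n_models, feat_frac=0.6, seed=0):
    """Build n_models independent training views by bagging over both axes:
    bootstrap-sample the data records AND draw a random subset of the
    feature attributes for each view. Each (rows, features) pair would then
    train one independent base model (a BP network in the cited paper)."""
    rng = random.Random(seed)
    k = max(1, int(feat_frac * n_features))  # features kept per view (assumed fraction)
    views = []
    for _ in range(n_models):
        rows = [data[rng.randrange(len(data))] for _ in data]  # bootstrap the records
        feats = sorted(rng.sample(range(n_features), k))       # random feature subset
        views.append((rows, feats))
    return views

records = list(range(50))  # stand-in for the m outburst data records
views = bagged_views(records, n_features=8, n_models=5)
```

Resampling the feature axis as well as the record axis decorrelates the base models, which is what lets averaging or voting over them reduce variance.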