2017
DOI: 10.1016/j.neucom.2017.03.063
Multi-train: A semi-supervised heterogeneous ensemble classifier

Cited by 36 publications (11 citation statements)
References 24 publications
“…Heterogeneous ensembles that use both different types of models and different input features, created by feature selection and feature extraction, have shown better performance than those using a single type of input [60], [61], partly due to their inherently better ability to create diversity [36]. Therefore, this work also employs different features and different types of models to promote ensemble diversity without impairing accuracy, hoping to enhance the ability of the ensemble surrogate in uncertainty estimation.…”
Section: Heterogeneous Ensemble Generation
confidence: 99%
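The idea in the excerpt above — drawing diversity from both the model type and the input features — can be sketched with a toy ensemble. Everything here (the two toy learners, the feature subsets, the data) is hypothetical illustration, not the cited authors' method:

```python
# Heterogeneous ensemble sketch: each member pairs a different model
# type with a different feature subset, and predictions are combined
# by majority vote. All learners and data are toy examples.

def nearest_centroid_fit(X, y):
    """Model type 1: classify by the nearest class centroid."""
    centroids = {}
    for label in set(y):
        rows = [x for x, t in zip(X, y) if t == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return lambda x: min(
        centroids,
        key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centroids[c])),
    )

def decision_stump_fit(X, y):
    """Model type 2: one-feature threshold rule picked by training accuracy."""
    best = None
    for f in range(len(X[0])):
        for thr in sorted({x[f] for x in X}):
            for lo, hi in [(0, 1), (1, 0)]:
                rule = lambda x, f=f, thr=thr, lo=lo, hi=hi: hi if x[f] > thr else lo
                acc = sum(rule(x) == t for x, t in zip(X, y))
                if best is None or acc > best[0]:
                    best = (acc, rule)
    return best[1]

def project(X, feats):
    return [[x[f] for f in feats] for x in X]

def build_ensemble(X, y):
    # Diversity along both axes: model type AND input features.
    members = []
    for fit, feats in [(nearest_centroid_fit, [0, 1]),
                       (decision_stump_fit, [0]),
                       (nearest_centroid_fit, [1])]:
        model = fit(project(X, feats), y)
        members.append((model, feats))
    def predict(x):
        votes = [m([x[f] for f in feats]) for m, feats in members]
        return max(set(votes), key=votes.count)  # majority vote
    return predict

# Toy data: class 1 when both features are large.
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.8]]
y = [0, 0, 1, 1]
ensemble = build_ensemble(X, y)
print(ensemble([0.85, 0.9]))  # -> 1
```

Each member sees a different view of the data, so their errors are less correlated than copies of a single model would be — the source of the diversity gain the excerpt refers to.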
“…In machine learning, it has been theoretically proved that, when the right balance between diversity and accuracy is struck, an ensemble can provide more accurate predictions than any of its members alone [33]. Ensembles are often categorized into homogeneous ensembles (consisting of the same type of model, typically generated by manipulating the data, e.g., through random sampling [34]) and heterogeneous ensembles (composed of different types of models [35] or different input features [36]).…”
Section: Introduction
confidence: 99%
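The homogeneous route mentioned above — one model type, diversified by random sampling of the data — can be sketched as a bagging-style loop. The nearest-centroid learner and the data are hypothetical:

```python
import random

# Homogeneous ensemble sketch: the same toy model type trained on
# different bootstrap samples of the data (bagging-style).

def fit_centroid(X, y):
    cents = {}
    for label in set(y):
        rows = [x for x, t in zip(X, y) if t == label]
        cents[label] = [sum(c) / len(rows) for c in zip(*rows)]
    return lambda x: min(
        cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c]))
    )

def bagging_ensemble(X, y, n_members=5, seed=0):
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        # Bootstrap sample: draw |X| indices with replacement,
        # resampling until both classes are represented.
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        while len({y[i] for i in idx}) < 2:
            idx = [rng.randrange(len(X)) for _ in range(len(X))]
        members.append(fit_centroid([X[i] for i in idx], [y[i] for i in idx]))
    def predict(x):
        votes = [m(x) for m in members]
        return max(set(votes), key=votes.count)
    return predict

X = [[0.0, 0.1], [0.1, 0.0], [0.9, 1.0], [1.0, 0.9]]
y = [0, 0, 1, 1]
model = bagging_ensemble(X, y)
print(model([0.95, 0.95]))  # -> 1
```

A heterogeneous ensemble would instead vary the `fit_*` function (and possibly the feature subset) across members rather than the sample.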
“…The performance of supervised classifiers has been explored in intrusion detection [29], robotics [18], the semantic web [19], human posture recognition [30], face recognition [20], biomedical data classification [31], handwritten character recognition [22], and land cover classification [21]. Furthermore, an innovative semi-supervised heterogeneous ensemble classifier called Multi-train [32] was proposed, with a fair comparison against supervised classifiers such as k-Nearest Neighbour (kNN), J48, Naïve Bayes, and random tree. Multi-train improved the prediction accuracy on unlabeled data and can therefore reduce the risk of labeling the unlabeled data incorrectly.…”
Section: Related Work
confidence: 99%
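The self-labeling mechanism the excerpt credits to Multi-train — heterogeneous learners pseudo-labeling unlabeled points only when confident — can be sketched generically. This illustrates the general idea, not the authors' exact procedure; both toy learners and the agreement criterion are assumptions:

```python
# Generic self-labeling sketch: several heterogeneous learners vote on
# unlabeled points; only points with a unanimous vote are pseudo-labeled
# and added to the training set, lowering the risk of wrong labels.

def fit_centroid(X, y):
    cents = {}
    for label in set(y):
        rows = [x for x, t in zip(X, y) if t == label]
        cents[label] = [sum(c) / len(rows) for c in zip(*rows)]
    return lambda x: min(
        cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c]))
    )

def fit_nn(X, y):
    # 1-nearest-neighbour: a different model type, for diversity.
    return lambda x: min(
        zip(X, y), key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p[0]))
    )[1]

def self_label(X_lab, y_lab, X_unlab, rounds=3):
    X_lab, y_lab, X_unlab = list(X_lab), list(y_lab), list(X_unlab)
    for _ in range(rounds):
        models = [fit_centroid(X_lab, y_lab), fit_nn(X_lab, y_lab)]
        still_unlabeled = []
        for x in X_unlab:
            votes = {m(x) for m in models}
            if len(votes) == 1:            # unanimous => high confidence
                X_lab.append(x)
                y_lab.append(votes.pop())
            else:
                still_unlabeled.append(x)  # defer risky points
        X_unlab = still_unlabeled
        if not X_unlab:
            break
    return X_lab, y_lab

X_lab, y_lab = [[0.0, 0.0], [1.0, 1.0]], [0, 1]
X_unlab = [[0.1, 0.1], [0.9, 0.95], [0.2, 0.0]]
Xl, yl = self_label(X_lab, y_lab, X_unlab)
print(len(Xl), yl)  # all three points pseudo-labeled -> 5 [0, 1, 0, 1, 0]
```

Requiring agreement between dissimilar models is what curbs the error-propagation risk mentioned in the excerpt: a point that only one learner is sure about stays unlabeled.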
“…In this section, we compare WEDA with other related methods: adaptive semi-unsupervised weighted oversampling (ASUWO) [53], a post-processing technique for the support vector machine (BSVM) [54], a minimally spanned support vector machine (MSSVM) [55], a semi-supervised heterogeneous ensemble classifier (Multi-train) [56], SSO-SMOTE-SSO [57], and vote-boosting ensembles (VBensembles) [58]. The comparison covers three aspects: precision, recall, and execution time.…”
Section: Comparison With Other Algorithms
confidence: 99%
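Two of the three comparison criteria named above, precision and recall, reduce to simple counts over a confusion matrix. A minimal binary-case sketch (the label vectors are made up purely for illustration):

```python
# Precision = TP / (TP + FP): of the points predicted positive, how many are.
# Recall    = TP / (TP + FN): of the truly positive points, how many we found.

def precision_recall(y_true, y_pred, positive=1):
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0]
print(precision_recall(y_true, y_pred))  # both 2/3 here: 2 TP, 1 FP, 1 FN
```

The third criterion, execution time, is measured separately (e.g. by wall-clock timing of training plus prediction) and trades off against these two accuracy measures.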