2017
DOI: 10.1016/j.neucom.2017.01.067
Classifier ensembles for image identification using multi-objective Pareto features

Cited by 39 publications (16 citation statements)
References 45 publications
“…In Multi-Train, classifier diversity is gleaned by simultaneously manipulating data, manipulating input attributes, using various machine learning algorithms, and various models. Heterogeneous ensembles have proved to be more effective in achieving diversity [23,26,27], which is also empirically confirmed by statistically better results than those of the tri-training algorithm on the 12 datasets used in this work. Second, the proposed method predicts a probability of a data having a particular label, which can then be used to select the most confidently predicted unlabeled data to be added to labeled data.…”
Section: Introduction (supporting)
confidence: 71%
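The excerpt above describes selecting the most confidently predicted unlabeled samples and adding them to the labeled pool. A minimal sketch of that selection step, assuming the ensemble already yields a class-probability vector per sample (the function and threshold names are illustrative, not from the cited paper):

```python
# Confidence-based pseudo-labeling: keep only unlabeled samples whose
# top class probability clears a threshold, paired with that label.

def select_confident(unlabeled, probs, threshold=0.9):
    """Return (sample, predicted_label) pairs whose top class
    probability meets the threshold."""
    selected = []
    for x, p in zip(unlabeled, probs):
        label = max(range(len(p)), key=p.__getitem__)  # argmax class
        if p[label] >= threshold:
            selected.append((x, label))
    return selected

probs = [[0.95, 0.05], [0.55, 0.45], [0.10, 0.90]]
unlabeled = ["a", "b", "c"]
print(select_confident(unlabeled, probs))  # [('a', 0), ('c', 1)]
```

In a self-training loop this selection would be repeated, retraining the ensemble after each batch of pseudo-labeled data is added.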
“…To benefit from the improved accuracy of ensemble learning [20,21,22,23], techniques that combine SSL with ensembles have recently attracted much interest. For example, Shao and Tian [24] proposed a selective SSL ensemble learning method based on the distance to the model.…”
Section: Introduction (mentioning)
confidence: 99%
“…Heterogeneous ensembles that use both different types of models and different input features created by feature selection and feature extraction have shown better performance than those using a single type of input [60], [61], partly due to the inherent better ability to create diversity [36]. Therefore, this work also employs different features and different types of models for promoting ensemble diversity without impairing accuracy, hoping to enhance the ability of the ensemble surrogate in uncertainty estimation.…”
Section: Heterogeneous Ensemble Generation (mentioning)
confidence: 99%
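The idea in the excerpt above — members differing in both model type and input features, with member disagreement used for uncertainty estimation — can be sketched as follows. The two toy "models" and all names are illustrative stand-ins, not the surrogates from the cited work:

```python
# Heterogeneous ensemble: different model types over different feature
# views; the spread of member predictions serves as a rough
# uncertainty estimate.
import statistics

def linear_model(x):        # model type 1: uses all features
    return sum(x)

def subset_model(x):        # model type 2: uses a selected feature only
    return 2.0 * x[0]

def ensemble_predict(x, members):
    preds = [m(x) for m in members]
    mean = statistics.fmean(preds)
    uncertainty = statistics.pstdev(preds)  # disagreement = uncertainty
    return mean, uncertainty

mean, unc = ensemble_predict([1.0, 2.0, 3.0], [linear_model, subset_model])
print(mean, unc)  # 4.0 2.0
```

Large member disagreement (high `unc`) flags inputs where the ensemble surrogate is least trustworthy.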
“…In machine learning, it has been theoretically proved that when a right balance between diversity and accuracy is met, an ensemble can provide more accurate predictions than any of its member alone [33]. Ensembles are often categorized into homogeneous ensembles (consisting of the same type of models typically generated by manipulating the data, like random sampling [34]) and heterogeneous ensembles (composed of different types of models [35] or different input features [36]).…”
Section: Introduction (mentioning)
confidence: 99%
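The homogeneous case mentioned above — one model type, diversity induced only by random resampling of the data [34] — is classic bagging. A minimal sketch, using a trivial mean predictor as the "model" purely for illustration:

```python
# Homogeneous ensemble via bootstrap resampling (bagging): every member
# is the same model type, trained on a different random resample.
import random
import statistics

def train_mean_model(sample):
    mu = statistics.fmean(sample)
    return lambda: mu                         # constant predictor

def bagging(data, n_models, rng):
    models = []
    for _ in range(n_models):
        boot = [rng.choice(data) for _ in data]   # bootstrap sample
        models.append(train_mean_model(boot))
    return models

rng = random.Random(0)
members = bagging([1.0, 2.0, 3.0, 4.0], n_models=5, rng=rng)
prediction = statistics.fmean(m() for m in members)
```

A heterogeneous ensemble would instead vary the model type and/or input features across members, as in the excerpt preceding this one.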
“…The definition of the best technique for a particular classification application is a challenging problem and has been addressed by several authors [15][16][17][18][19][20][21]. This problem has been treated as a meta-learning problem [17,22], automatic selection of machine learning (auto-ML) [16,18,20] or an optimization problem [15,19,21]. In [18], for instance, the authors used an auto-ML technique to define the best classification algorithm, along with the best set of parameters, for a specific classification problem.…”
Section: State-of-the-art: Optimization Techniques For Classifier Ens… (mentioning)
confidence: 99%
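At its core, treating "pick the best technique for this classification problem" as a search or optimization problem, as the excerpt above describes, means scoring each candidate on held-out data and keeping the winner. A toy sketch with stand-in candidates (not the auto-ML or meta-learning systems cited):

```python
# Classifier selection as search: evaluate each candidate on held-out
# (input, label) pairs and return the name of the most accurate one.

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

def select_best(candidates, held_out):
    return max(candidates, key=lambda name: accuracy(candidates[name], held_out))

held_out = [(0, 0), (1, 1), (2, 0), (3, 1)]
candidates = {
    "parity": lambda x: x % 2,      # predicts label = x mod 2
    "always_zero": lambda x: 0,
}
best = select_best(candidates, held_out)
print(best)  # parity
```

Real systems search a far larger space (algorithms plus their hyperparameters), often with evolutionary or Bayesian optimizers rather than exhaustive scoring.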