2020
DOI: 10.1109/tnnls.2019.2945116

Ensemble Pruning Based on Objection Maximization With a General Distributed Framework

Abstract: Ensemble pruning, selecting a subset of individual learners from an original ensemble, alleviates the deficiencies of ensemble learning in terms of time and space cost. Accuracy and diversity serve as two crucial factors, while they usually conflict with each other. To balance both of them, we formalize the ensemble pruning problem as an objection maximization problem based on information entropy. Then we propose an ensemble pruning method including a centralized version and a distributed version, in which the la…

Cited by 41 publications (11 citation statements)
References: 49 publications
“…A machine learning model is considered accurate if it has a good generalization ability on unseen instances. In contrast, ML models are diverse if their errors on unseen instances are not the same [47]. Therefore, diversity is seen as the difference between base learners in an ensemble [50].…”
Section: Overview of Ensemble Learning
confidence: 99%
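The notions in the statement above can be made concrete with simple held-out-set measurements. The sketch below is a minimal illustration and is not taken from the cited works: it scores accuracy as the fraction of correct predictions and diversity as average pairwise disagreement, one common (but not the only) way to quantify the "difference between base learners".

```python
import numpy as np

def accuracy(preds, y_true):
    """Fraction of unseen instances a base learner classifies correctly."""
    return float(np.mean(preds == y_true))

def pairwise_disagreement(preds_a, preds_b):
    """Fraction of instances on which two base learners predict differently,
    i.e. their errors on the held-out set are not the same."""
    return float(np.mean(preds_a != preds_b))

def ensemble_diversity(all_preds):
    """Average pairwise disagreement over all base-learner pairs.
    all_preds: list of 1-D label arrays, one per base learner,
    all evaluated on the same held-out instances."""
    m = len(all_preds)
    pairs = [(i, j) for i in range(m) for j in range(i + 1, m)]
    return float(np.mean([pairwise_disagreement(all_preds[i], all_preds[j])
                          for i, j in pairs]))
```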
“…Clustering-based methods do not need a labeled pruning set as the other methods do; they cluster the models based on a distance measure (which could be computed on artificial data) and prune the clusters until k members are left. Optimization-based ensemble pruning methods such as Centralized Objection Maximization for Ensemble Pruning (COMEP) by Bian, Wang, et al. start with a single ensemble member and optimize an objective iteratively for each new member [32]. Other methods such as genetic algorithms [33] can also be used as optimization-based ensemble pruners [31].…”
Section: B. Related Work
confidence: 99%
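To illustrate what an optimization-based pruner such as COMEP does procedurally, the sketch below implements only the generic greedy pattern described in the quote: start from a single member and grow the selected subset one member at a time, each time picking the candidate that maximizes an objective. The `objective` callable is a placeholder supplied by the caller; the actual information-entropy objective of COMEP is not reproduced here.

```python
def greedy_prune(all_preds, y_true, k, objective):
    """Greedy optimization-based ensemble pruning (generic pattern only).
    all_preds: list of 1-D prediction arrays, one per base learner.
    objective: callable scoring a candidate subset of predictions,
               e.g. a function trading accuracy off against diversity.
    Returns the indices of the k selected base learners."""
    n = len(all_preds)
    # Start from the single member that maximizes the objective on its own.
    selected = [max(range(n), key=lambda i: objective([all_preds[i]], y_true))]
    # Iteratively add the member whose inclusion maximizes the objective.
    while len(selected) < k:
        remaining = (i for i in range(n) if i not in selected)
        best = max(remaining, key=lambda i: objective(
            [all_preds[j] for j in selected] + [all_preds[i]], y_true))
        selected.append(best)
    return selected
```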
“…The credibility thresholds of the base classifiers are found by minimizing the empirical 0–1 loss on the entire set of training observations. An optimization-based approach to ensemble pruning is proposed in [2], where an objective function used in the selection process is derived from an information-entropy perspective. This function takes both diversity and accuracy into consideration, implicitly and simultaneously.…”
Section: Related Work
confidence: 99%
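As a rough illustration of an objective that "takes diversity and accuracy into consideration simultaneously", the toy function below combines majority-vote accuracy with average pairwise disagreement through a weight `lam`. This additive form and the weight are assumptions introduced here for illustration; the entropy-based objective of [2] couples the two factors differently.

```python
import numpy as np

def toy_objective(subset_preds, y_true, lam=0.5):
    """Toy subset score: majority-vote accuracy + lam * average pairwise
    disagreement. Assumes non-negative integer class labels."""
    votes = np.stack(subset_preds)                  # shape (m, n_samples)
    majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(),
                                   0, votes)
    acc = float(np.mean(majority == y_true))
    m = votes.shape[0]
    if m < 2:
        return acc                                  # no pairs, no diversity term
    div = float(np.mean([np.mean(votes[i] != votes[j])
                         for i in range(m) for j in range(i + 1, m)]))
    return acc + lam * div

# Example: prune a pool of predictions down to 3 members with the greedy sketch above.
# pruned = greedy_prune(all_preds, y_true, k=3, objective=toy_objective)
```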