2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
DOI: 10.1109/ijcnn.2008.4633992
Empirical comparison of Dynamic Classifier Selection methods based on diversity and accuracy for building ensembles

Abstract: In the context of Ensembles or Multi-Classifier Systems, the choice of ensemble members is a complex task that, in some cases, can lead to ensembles with no performance improvement. To avoid this situation, a great deal of research has gone into finding effective classifier member selection methods. In this paper, we propose a selection criterion based on both the accuracy and diversity of the classifiers in the initial pool. Also, instead of using a static selection method, we use a Dynam…

Cited by 17 publications (10 citation statements) | References 17 publications
“…Many scholars have conducted in-depth studies of the accuracy of dynamic ensemble algorithms [3] - [7] and proposed a variety of dynamic ensemble strategies: Overall Local Accuracy, Local Class Accuracy, A Priori Selection, and A Posteriori Selection.…”
Section: Dynamic Ensemble Methods (citation type: mentioning)
confidence: 99%
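The Overall Local Accuracy strategy named in the statement above can be sketched briefly: for each test instance, each pool member is scored by its accuracy on the instance's nearest validation neighbors, and the most locally accurate member classifies the instance. This is a minimal illustration, not the paper's exact procedure; the toy threshold classifiers and data below are hypothetical.

```python
import math

def ola_select(x, pool, X_val, y_val, k=5):
    """Overall Local Accuracy (OLA): return the pool member with the
    highest accuracy on the k validation points nearest to x.
    `pool` is a list of callables mapping a feature vector to a label."""
    # region of competence: indices of the k nearest validation points
    dists = [math.dist(x, v) for v in X_val]
    region = sorted(range(len(X_val)), key=dists.__getitem__)[:k]
    # local accuracy of each classifier on that region
    def local_acc(clf):
        return sum(clf(X_val[i]) == y_val[i] for i in region) / k
    return max(pool, key=local_acc)

# toy pool: two 1-D threshold rules standing in for trained classifiers
clf_a = lambda v: int(v[0] > 0.5)
clf_b = lambda v: int(v[0] > 0.2)
X_val = [[0.1], [0.3], [0.4], [0.8], [0.9]]
y_val = [0, 1, 1, 1, 1]          # clf_b is correct on every validation point
best = ola_select([0.35], [clf_a, clf_b], X_val, y_val, k=3)
```

Local Class Accuracy differs only in that each classifier is scored on the region points it assigns to the same class it predicts for `x`, rather than on the whole region.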
“…As the clusters are fixed once and for all, many different instances might share the same region of competence. In contrast, kNN methods give different regions of competence from one instance to another, which allows for more flexibility but at the expense of a higher computational cost [11].…”
Section: Evaluation and Selection of the Best Classifier (citation type: mentioning)
confidence: 99%
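The clustering-based alternative described above trades flexibility for speed: the best classifier per cluster is computed once on the validation set, so classifying a new instance costs only a centroid lookup instead of a kNN search. A minimal sketch under the same hypothetical toy setup as above (fixed, pre-supplied centroids; simple callables as classifiers):

```python
import math

def fit_cluster_selection(pool, X_val, y_val, centroids):
    """Cluster-based DCS: precompute, once, the most accurate pool member
    within each fixed cluster of the validation set."""
    def nearest(v):
        return min(range(len(centroids)), key=lambda c: math.dist(v, centroids[c]))
    best_per_cluster = []
    for c in range(len(centroids)):
        # validation points falling in cluster c form its fixed region of competence
        members = [i for i, v in enumerate(X_val) if nearest(v) == c]
        def acc(clf, m=members):
            return sum(clf(X_val[i]) == y_val[i] for i in m) / max(len(m), 1)
        best_per_cluster.append(max(pool, key=acc))
    # at test time, classification is a single centroid lookup, not a kNN search
    return lambda x: best_per_cluster[nearest(x)](x)

# hypothetical toy pool and validation data
clf_a = lambda v: int(v[0] > 0.5)
clf_b = lambda v: int(v[0] > 0.2)
X_val = [[0.1], [0.3], [0.4], [0.8], [0.9]]
y_val = [0, 1, 1, 1, 1]
predict = fit_cluster_selection([clf_a, clf_b], X_val, y_val, [[0.2], [0.8]])
```

Every instance nearest to the same centroid is handled by the same classifier, which is exactly the shared-region behavior the quoted statement contrasts with per-instance kNN regions.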
“…by using a preselected committee of individual classifiers and making the final decision on the basis of a voting rule [28]. The random reference classifier used for dynamic selection was proposed in Ref. 29.…”
Section: Introduction (citation type: mentioning)
confidence: 99%