2011
DOI: 10.1007/s00521-011-0737-9

Dynamic selection approaches for multiple classifier systems

Cited by 89 publications (51 citation statements)
References 17 publications
“…On the other hand, Dos Santos et al. [10] introduce a two-step DES method: in the first step, highly accurate candidate ensembles are selected; in the second step, for each test sample, the ensemble with the largest confidence level among those candidates is selected. In a further work, Cavalin et al. [7] extend this approach and adapt it to a Dynamic Multistage Organization strategy.…”
Section: Dynamic Classifier Selection (DCS)
confidence: 98%
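As a rough illustration of that two-step scheme, the sketch below first keeps the most accurate candidate ensembles on a validation set and then, per test sample, picks the candidate with the largest confidence. The helper names, the top_k parameter, and the confidence measure (maximum averaged posterior from scikit-learn-style predict_proba) are assumptions for illustration, not the exact procedure of Dos Santos et al. [10].

```python
# Minimal sketch of a two-step DES scheme; ensembles are assumed to expose
# scikit-learn-style predict() and predict_proba().
import numpy as np

def select_candidate_ensembles(ensembles, X_val, y_val, top_k=5):
    """Step 1: keep the top_k candidate ensembles with the highest validation accuracy."""
    accs = [np.mean(ens.predict(X_val) == y_val) for ens in ensembles]
    order = np.argsort(accs)[::-1][:top_k]
    return [ensembles[i] for i in order]

def classify_most_confident(candidates, x):
    """Step 2: for one test sample, use the candidate ensemble whose prediction
    carries the largest confidence (here the maximum of its averaged posteriors)."""
    best_label, best_conf = None, -np.inf
    for ens in candidates:
        proba = ens.predict_proba(x.reshape(1, -1))[0]
        if proba.max() > best_conf:
            best_conf, best_label = proba.max(), int(np.argmax(proba))
    return best_label
```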
“…Firstly, the best algorithm is different for each dataset [14][15][16]. The best algorithm also varies from datapoint to datapoint within a dataset [17,18]. In addition, introducing variety into a dataset, for example through sampling, partitioning, or decomposition, also changes which algorithm performs best [5].…”
Section: Using Different Classifiers
confidence: 99%
“…The goal of dynamic selection is then to find a subset of classifiers C* (C* ⊆ C) that correctly classify a given unknown pattern Q. In the literature [16], the subset C* may be composed of a single classifier [9,43] or an ensemble of classifiers [7,20]. In general, selection is performed by estimating the competence of the classifiers available in the pool on local regions of the feature space.…”
Section: Dynamic Selection Of Classifiers
confidence: 99%
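As a rough sketch of this general procedure (hypothetical helper names; scikit-learn-style classifiers and a user-supplied competence function are assumptions), the local region for a query pattern Q can be taken as its k nearest neighbours in a validation set, and the most competent classifier in the pool is returned:

```python
# Generic dynamic selection loop: competence is estimated on the local region
# (the k nearest validation neighbours of the query); the sketch returns a
# single classifier, but the same loop could return the top-m classifiers to
# form an ensemble instead.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def select_classifier(pool, X_val, y_val, q, competence, k=7):
    nn = NearestNeighbors(n_neighbors=k).fit(X_val)
    idx = nn.kneighbors(q.reshape(1, -1), return_distance=False)[0]
    X_loc, y_loc = X_val[idx], y_val[idx]   # local region of competence
    scores = [competence(clf, X_loc, y_loc, q) for clf in pool]
    return pool[int(np.argmax(scores))]
```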
“…The feature space is divided into different partitions, and the most capable classifiers for a given unknown pattern Q are determined. Regarding the competence measures, the literature shows that they may be based on accuracy (overall local accuracy or local class accuracy) [43], probabilistic information [9], classifier behavior computed on the output profiles [7,10], and oracle information [20,22].…”
Section: Dynamic Selection Of Classifiers
confidence: 99%
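For concreteness, the two accuracy-based competence measures mentioned above can be sketched as follows (scikit-learn-style classifiers assumed; these are illustrative definitions rather than the exact formulations of [43]). Either function could be passed as the competence argument of the selection loop sketched earlier.

```python
# Illustrative accuracy-based competence measures over a local region (X_loc, y_loc).
import numpy as np

def overall_local_accuracy(clf, X_loc, y_loc, q):
    """OLA: accuracy of the classifier over the whole local region."""
    return float(np.mean(clf.predict(X_loc) == y_loc))

def local_class_accuracy(clf, X_loc, y_loc, q):
    """LCA: among local samples the classifier assigns to the same class it
    assigns to Q, the fraction whose true label matches that class."""
    pred_q = clf.predict(q.reshape(1, -1))[0]
    mask = clf.predict(X_loc) == pred_q
    return float(np.mean(y_loc[mask] == pred_q)) if mask.any() else 0.0
```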