2019
DOI: 10.1016/j.patcog.2018.07.037

FIRE-DES++: Enhanced online pruning of base classifiers for dynamic ensemble selection

Abstract: Dynamic Ensemble Selection (DES) techniques aim to select one or more competent classifiers for the classification of each new test sample. Most DES techniques estimate the competence of classifiers using a given criterion over the region of competence of the test sample, usually defined as the set of nearest neighbors of the test sample in the validation set. Despite being very effective in several classification tasks, DES techniques can select classifiers that classify all samples in …
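As a rough illustration of the pipeline the abstract describes, the sketch below (plain scikit-learn, illustrative variable names, not the authors' implementation) defines the region of competence of each test sample as its k nearest neighbours in a held-out validation set (DSEL), scores each base classifier by its local accuracy inside that region, and lets the most competent classifier label the sample.

```python
# Minimal sketch of a generic DES/DCS pipeline: region of competence via k-NN
# over the validation set (DSEL), competence = local accuracy, per-sample selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_dsel, X_test, y_dsel, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

pool = BaggingClassifier(DecisionTreeClassifier(max_depth=3),
                         n_estimators=10, random_state=0).fit(X_train, y_train)
nn = NearestNeighbors(n_neighbors=7).fit(X_dsel)

def predict_dcs(x):
    """Select the single most locally competent classifier for sample x."""
    _, idx = nn.kneighbors(x.reshape(1, -1))           # region of competence
    roc_X, roc_y = X_dsel[idx[0]], y_dsel[idx[0]]
    competences = [clf.score(roc_X, roc_y) for clf in pool.estimators_]
    best = int(np.argmax(competences))                 # local-accuracy criterion
    return pool.estimators_[best].predict(x.reshape(1, -1))[0]

y_pred = np.array([predict_dcs(x) for x in X_test])
print("dynamic selection accuracy:", (y_pred == y_test).mean())
```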

Cited by 50 publications (22 citation statements)
References 39 publications
“…Moreover, to address the problem of borderline samples in the local region, the Frienemy Indecision REgion-DES (FIRE-DES) method introduces a notion of "frienemy" by pre-selecting classifiers that correctly classify at least one pair of samples from different classes [53]. FIRE-DES++ is an extension of FIRE-DES to ameliorate the noise sensitivity and indecision region problems [54]. It removes the noises and reduces the overlap of classes in the validation set and then applies K-Nearest Neighbors Equality (KNNE) to define a more balanced RoC.…”
Section: Region Of Competence (mentioning)
confidence: 99%
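The statement above bundles two mechanisms: FIRE-DES's frienemy pre-selection (in an indecision region, keep only classifiers that correctly classify at least one pair of region-of-competence samples from different classes) and FIRE-DES++'s use of K-Nearest Neighbors Equality (KNNE) to build a class-balanced region of competence. The sketch below is a minimal, hedged rendering of both ideas; the function names are illustrative, and the commented usage lines assume the variables from the sketch following the abstract.

```python
import numpy as np
from itertools import combinations
from sklearn.neighbors import NearestNeighbors

def knne_region(x, X_dsel, y_dsel, k_per_class=7):
    """K-Nearest Neighbors Equality (FIRE-DES++): take k neighbours per class
    so the region of competence is balanced across classes."""
    idx = []
    for c in np.unique(y_dsel):
        members = np.where(y_dsel == c)[0]
        nn_c = NearestNeighbors(n_neighbors=min(k_per_class, len(members)))
        nn_c.fit(X_dsel[members])
        _, local = nn_c.kneighbors(x.reshape(1, -1))
        idx.extend(members[local[0]])
    return np.array(idx)

def frienemy_preselect(roc_X, roc_y, classifiers):
    """Frienemy pre-selection (FIRE-DES): inside an indecision region, keep
    only classifiers that correctly classify at least one pair of RoC samples
    drawn from different classes ("frienemies")."""
    if len(np.unique(roc_y)) < 2:              # not an indecision region: keep all
        return list(classifiers)
    pairs = [(i, j) for i, j in combinations(range(len(roc_y)), 2)
             if roc_y[i] != roc_y[j]]          # cross-class ("frienemy") pairs
    selected = []
    for clf in classifiers:
        hits = clf.predict(roc_X) == roc_y
        if any(hits[i] and hits[j] for i, j in pairs):
            selected.append(clf)
    return selected if selected else list(classifiers)  # fall back: prune nothing

# Example usage (assuming X_dsel, y_dsel, pool and X_test from the earlier sketch):
# roc_idx = knne_region(X_test[0], X_dsel, y_dsel)
# competent = frienemy_preselect(X_dsel[roc_idx], y_dsel[roc_idx], pool.estimators_)
```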
“…and ℎ are META-DES parameters that were set to 5 and 1, respectively, and ∆ is a DDES-I parameter that was set to 0.1 × . See similar settings in [48], [53], [54].…”
Section: Table 3 Base Classifiers and Hyperparameter Pools (mentioning)
confidence: 99%
“…In contrast, in dynamic ensemble classifiers (DES), the decision of which classifiers should be combined for generating the final output is postponed until generalization phase [39–45]. In other words, there will be not a fixed subset of classifiers which applies to any test instance.…”
Section: Related Work (mentioning)
confidence: 99%
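For readers who want to apply such a method rather than re-implement it, the snippet below is a hedged sketch using the open-source DESlib package; the KNORA-Eliminate import path, the pool_classifiers/k arguments, and the DFP switch (which enables FIRE-DES-style frienemy pruning) reflect recent DESlib versions and should be treated as assumptions.

```python
# Hedged sketch using DESlib (pip install deslib); API details are assumptions.
from deslib.des.knora_e import KNORAE
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_dsel, X_test, y_dsel, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

pool = BaggingClassifier(n_estimators=10, random_state=0).fit(X_train, y_train)

# DFP=True turns on the frienemy pruning step; the ensemble used for each
# prediction is chosen at predict time, separately for every test sample.
des = KNORAE(pool_classifiers=pool, k=7, DFP=True)
des.fit(X_dsel, y_dsel)
print("KNORA-E (+DFP) accuracy:", des.score(X_test, y_test))
```

Because selection happens inside predict, a different subset of the pool can back every test instance, which is exactly the contrast with static ensembles drawn in the quote above.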
“…Thus, defining an area of competence with unbiased validation data instead of biased data. The results show significant improvement when compared to the existing dynamic pruning methods [46]. Bernard et al. [2009] proposed a different approach to add the trees to random forest based on feature selection strategy. The aim was to point out that random forest subsets performs better than original random forest.…”
Section: Random Forest-specific Pruning Techniques (mentioning)
confidence: 99%