2014
DOI: 10.1016/j.eswa.2013.08.059

Filter-based optimization techniques for selection of feature subsets in ensemble systems

Cited by 45 publications (22 citation statements)
References 20 publications
“…The advantages of wrapper-based methods include that they take into account the ability of a classifier model to approximate or accommodate the distribution of a proposed feature [37], hence the final selected feature subset, when used with the desired classifier, often performs better than features selected by a filter-based method [35]. The main limitation of wrapper-based methods is that they are computationally intensive, which can be an issue when the data set contains a large number of features [38]. Compared with filter-based methods, they are susceptible to a higher risk of over-fitting, or require that additional data is acquired with which to perform final validation of the trained classifier model [36].…”
Section: Feature Selection
confidence: 98%
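The wrapper strategy described in this citation statement can be illustrated with a minimal sketch: a greedy forward search that scores each candidate subset by the accuracy of the classifier it wraps. Here the wrapped model is a leave-one-out 1-nearest-neighbour rule on a tiny synthetic data set; all function names and data below are illustrative assumptions, not material from the cited paper.

```python
# Hedged sketch of wrapper-based forward feature selection.
# The "wrapped" classifier (1-NN with leave-one-out evaluation) and the
# toy data are assumptions made for illustration only.

def nn_accuracy(X, y, feats):
    """Leave-one-out accuracy of 1-NN using only the given feature indices."""
    correct = 0
    for i, xi in enumerate(X):
        best, best_d = None, float("inf")
        for j, xj in enumerate(X):
            if i == j:
                continue
            d = sum((xi[f] - xj[f]) ** 2 for f in feats)
            if d < best_d:
                best_d, best = d, y[j]
        correct += best == y[i]
    return correct / len(X)

def forward_select(X, y, n_features):
    """Greedy forward search: add the feature that most improves accuracy."""
    selected, remaining, best_score = [], set(range(n_features)), 0.0
    while remaining:
        # Re-evaluate the wrapped classifier for every candidate subset --
        # this repeated training/evaluation is the computational cost the
        # quoted passage warns about.
        score, f = max((nn_accuracy(X, y, selected + [f]), f) for f in remaining)
        if score <= best_score:  # stop when no candidate improves accuracy
            break
        best_score = score
        selected.append(f)
        remaining.remove(f)
    return selected, best_score

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [(0.0, 5.0), (0.1, 1.0), (0.2, 9.0), (1.0, 4.0), (1.1, 0.0), (1.2, 8.0)]
y = [0, 0, 0, 1, 1, 1]
feats, acc = forward_select(X, y, n_features=2)
```

Because the score is tied to one specific classifier, the selected subset reflects that classifier's bias, which is also why the selection must be re-run if the classifier changes.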
“…Compared with filter-based methods, they are susceptible to a higher risk of over-fitting, or require that additional data is acquired with which to perform final validation of the trained classifier model [36]. In addition, wrapper-based feature selection must be re-run each time a new classifier model is used [38].…”
Section: Feature Selection
confidence: 99%
“…Recently, Santana and Canuto [16] solved a slightly different problem of the selection of a subset of features for ensemble systems. The authors used bio-inspired algorithms including Ant Colony Optimization, Particle Swarm Optimization, and Genetic Algorithms to solve the problem and the experiments showed that all three algorithms were able to solve that problem very efficiently.…”
Section: Algorithms For the Column Subset Selection Problem
confidence: 99%
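Of the three bio-inspired methods this citation statement mentions, the genetic algorithm is the simplest to sketch: feature subsets are encoded as bitmasks and evolved under selection, crossover, and mutation. The population size, operators, fitness function, and data below are illustrative assumptions, not the configuration used by Santana and Canuto.

```python
# Hedged sketch of a genetic algorithm over feature-subset bitmasks.
# Fitness here is a toy function rewarding masks close to a target subset;
# in practice it would be an ensemble's validation accuracy.
import random

def evolve(fitness, n_features, pop_size=8, generations=20, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]       # elitist selection: keep the best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_features)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_features)       # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy fitness: reward masks that keep features {0, 2} and drop the rest.
target = [1, 0, 1, 0, 0]
fitness = lambda mask: sum(1 for m, t in zip(mask, target) if m == t)
best = evolve(fitness, n_features=5)
```

Ant colony and particle swarm variants follow the same pattern, differing only in how candidate bitmasks are generated and updated.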
“…At present, for the identification of relevant metrics and the removal of irrelevant metrics, three different methods can be employed, namely the filter [22,23], wrapper [24,25], and hybrid models [26,27]. The filter model relies on general characteristics of the data to evaluate and select metrics subsets without involving prediction algorithm.…”
Section: Introduction
confidence: 99%
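The filter model this last citation statement describes can be sketched as a ranking by a general statistic of the data, with no prediction algorithm in the loop. Below, features are scored by a Fisher-style ratio of between-class separation to within-class spread; the scoring statistic and the toy data are assumptions made for illustration.

```python
# Hedged sketch of filter-based feature selection: rank features by a
# data statistic alone (no classifier involved), then keep the top k.

def fisher_score(values, labels):
    """Ratio of between-class mean separation to within-class spread."""
    groups = {}
    for v, c in zip(values, labels):
        groups.setdefault(c, []).append(v)
    means = {c: sum(vs) / len(vs) for c, vs in groups.items()}
    overall = sum(values) / len(values)
    between = sum(len(vs) * (means[c] - overall) ** 2 for c, vs in groups.items())
    within = sum((v - means[c]) ** 2 for c, vs in groups.items() for v in vs)
    return between / within if within else float("inf")

def filter_select(X, y, k):
    """Return the indices of the k highest-scoring features."""
    n_features = len(X[0])
    scores = [(fisher_score([row[f] for row in X], y), f) for f in range(n_features)]
    scores.sort(reverse=True)
    return [f for _, f in scores[:k]]

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [(0.0, 5.0), (0.1, 1.0), (0.2, 9.0), (1.0, 4.0), (1.1, 0.0), (1.2, 8.0)]
y = [0, 0, 0, 1, 1, 1]
top = filter_select(X, y, k=1)
```

Because no model is trained, this runs in one pass over the data, which is the speed advantage filters hold over the wrapper approach quoted earlier.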