2016
DOI: 10.1109/tevc.2015.2504420

A Survey on Evolutionary Computation Approaches to Feature Selection

Abstract: Link to publication on Research at Birmingham portal. General rights: Unless a licence is specified above, all rights (including copyright and moral rights) in this document are retained by the authors and/or the copyright holders. The express permission of the copyright holder must be obtained for any use of this material other than for purposes permitted by law. • Users may freely distribute the URL that is used to identify this publication. • Users may download and/or print one copy of the publication from th…

Citations: cited by 1,471 publications (707 citation statements)
References: 233 publications
“…Chen et al. have applied ant colony optimization (ACO) together with rough set theory for feature selection [8]. In particular, PSO, as a popular metaheuristic, has also been widely adopted for feature selection [40]. Chuang et al. have developed an improved binary PSO algorithm for feature selection using gene expression data [12].…”
Section: Metaheuristics for Feature Selection (citation type: mentioning)
confidence: 99%
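To make the binary-PSO idea in the excerpt above concrete, here is a minimal sketch of binary PSO for feature selection. It is not Chuang et al.'s algorithm; the KNN wrapper, the breast-cancer dataset, and all parameter values are illustrative assumptions chosen for brevity.

```python
# Minimal binary-PSO feature-selection sketch (illustrative, not Chuang et al.'s exact method).
# Each particle is a bit mask over features; fitness is cross-validated accuracy on the subset.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    # Accuracy of a KNN wrapper on the selected features; empty masks score zero.
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()

n_particles, n_iters, w, c1, c2 = 20, 20, 0.7, 1.5, 1.5
pos = rng.integers(0, 2, size=(n_particles, n_features))        # bit masks
vel = rng.uniform(-1, 1, size=(n_particles, n_features))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))                            # sigmoid transfer function
    pos = (rng.random(vel.shape) < prob).astype(int)             # re-sample bit masks
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest), "cv accuracy:", round(pbest_fit.max(), 3))
```

The sigmoid transfer function maps each velocity component to a probability of setting the corresponding bit, which is the usual way a continuous PSO update is adapted to a binary feature mask.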
“…As a notable multilabel feature wrapper study, Zhang et al. [12] proposed a multilabel feature selection method based on a genetic algorithm (GA), which is the most common choice in evolutionary feature wrapper studies [28]. Specifically, their method combined instance- and label-based evaluation metrics [39] as a fitness function to determine label dependency.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
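The following is a small sketch of the kind of GA feature wrapper for multilabel data described above, with a fitness that averages an instance-based and a label-based metric. It is not Zhang et al.'s method; the synthetic dataset, one-vs-rest classifier, and the choice of sample-wise and macro F1 are illustrative assumptions.

```python
# Hedged sketch of a GA feature wrapper for multilabel data: fitness averages an
# instance-based score (sample-wise F1) and a label-based score (macro F1).
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(1)
X, Y = make_multilabel_classification(n_samples=300, n_features=40, n_labels=3, random_state=1)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=1)

def fitness(mask):
    # Train a one-vs-rest wrapper on the selected features and combine the two metrics.
    if mask.sum() == 0:
        return 0.0
    clf = OneVsRestClassifier(LogisticRegression(max_iter=500))
    clf.fit(X_tr[:, mask], Y_tr)
    pred = clf.predict(X_te[:, mask])
    return 0.5 * (f1_score(Y_te, pred, average="samples", zero_division=0)
                  + f1_score(Y_te, pred, average="macro", zero_division=0))

pop_size, n_gen, n_feat = 20, 15, X.shape[1]
pop = rng.random((pop_size, n_feat)) < 0.5            # boolean chromosomes (feature masks)

for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    # Binary tournament selection, uniform crossover, and bit-flip mutation.
    parents = pop[[max(rng.choice(pop_size, 2), key=lambda i: scores[i]) for _ in range(pop_size)]]
    cross = rng.random((pop_size, n_feat)) < 0.5
    children = np.where(cross, parents, parents[::-1])
    children ^= rng.random((pop_size, n_feat)) < 0.02  # mutation: flip each bit with prob 0.02
    pop = children

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best))
```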
“…This typically results in better classification accuracy [11,12]. For this reason, we focus on a multilabel feature wrapper based on an evolutionary search process [28].…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…Both styles rely on efficient optimization algorithms. Some commonly used algorithms are random search, greedy search (such as step backward or step forward), and evolutionary search (such as genetic algorithms or particle swarm algorithms) [70,71].…”
Section: Curse of Dimensionality (citation type: mentioning)
confidence: 99%
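For contrast with the evolutionary wrappers above, here is a small sketch of greedy step-forward (sequential forward) selection, one of the search strategies the excerpt mentions. The KNN scorer and the wine dataset are illustrative choices, not part of the cited work.

```python
# Greedy step-forward (sequential forward) feature selection sketch.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)
selected, remaining = [], list(range(X.shape[1]))

def score(feats):
    # Cross-validated accuracy on a candidate feature subset.
    return cross_val_score(KNeighborsClassifier(), X[:, feats], y, cv=5).mean()

best_score = 0.0
while remaining:
    # Greedily add the single feature that most improves the score; stop when none helps.
    gains = [(score(selected + [f]), f) for f in remaining]
    new_score, best_f = max(gains)
    if new_score <= best_score:
        break
    best_score = new_score
    selected.append(best_f)
    remaining.remove(best_f)

print("selected:", selected, "cv accuracy:", round(best_score, 3))
```

Step-backward elimination is the mirror image: start from all features and greedily remove the one whose removal helps (or hurts least), stopping when the score no longer improves.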