2010
DOI: 10.1016/j.neucom.2010.04.003
A new wrapper feature selection approach using neural network

Abstract: This paper presents a new feature selection (FS) algorithm based on the wrapper approach using neural networks (NNs). The vital aspect of this algorithm is the automatic determination of NN architectures during the FS process. Our algorithm uses a constructive approach involving correlation information in selecting features and determining NN architectures. We call this algorithm the constructive approach for FS (CAFS). The aim of using correlation information in CAFS is to encourage the search strategy for sel…


Cited by 185 publications (21 citation statements)
References 37 publications
“…Wrapper techniques depend on the classification algorithm, which is used to evaluate the subsets of features, but they are more expensive when it comes to computation [44]. Despite this weakness, they often provide better outcomes; wrappers are used widely in many applications [45].…”
Section: Wrapper Methods
Mentioning confidence: 99%
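The excerpt above describes the core wrapper loop: candidate feature subsets are scored by actually training and evaluating a classifier, which is accurate but costly. A minimal sketch of one common wrapper strategy, greedy forward selection, is shown below; the dataset, the MLP classifier, and the stopping rule are illustrative choices for demonstration, not the CAFS algorithm from the paper.

```python
# Wrapper feature selection via greedy forward selection (sketch).
# Each candidate subset is scored by cross-validating a classifier,
# which is what makes wrappers accurate but computationally expensive.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)

selected, remaining = [], list(range(X.shape[1]))
best_score = 0.0
while remaining:
    # Score every candidate subset (current selection + one new feature)
    # with the wrapped classifier.
    scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=3).mean()
              for f in remaining}
    f, score = max(scores.items(), key=lambda kv: kv[1])
    if score <= best_score:  # stop when no candidate improves the score
        break
    selected.append(f)
    remaining.remove(f)
    best_score = score

print(selected, round(best_score, 3))
```

The inner `cross_val_score` call is the expensive step the excerpt refers to: the classifier is retrained for every candidate subset at every round.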
“…They need to train a predictor (classifier or regression model) to evaluate each feature subset; therefore, they are more precise but slower than the filters. The main challenges in wrapper methods are selecting a proper predictor (Li et al. 2009; Maldonado and Weber 2009; Monirul Kabir, Monirul Islam, and Murase 2010; Sánchez-Maroño and Alonso-Betanzos 2011) and how to generate appropriate subsets (Macas et al. 2012; Tay and Cao 2001; Vignolo, Milone, and Scharcanski 2013). Finally, hybrid methods use both filter and wrapper evaluators simultaneously (Bermejo et al. 2012; Bermejo, Gámez, and Puerta 2011; Gheyas and Smith 2010; Ruiz et al. 2012).…”
Section: Feature Selection
Mentioning confidence: 99%
“…However, wrappers are prone to overfitting and can be computationally expensive [10]. Hybrid and ensemble methods integrate filters and wrappers alike, thereby benefiting from their complementary approaches [10], [14]. Three of the most common feature selection techniques, apart from the ones used in this report, are mutual information (MI) [15], recursive feature elimination [16], and analysis of variance (ANOVA) tests.…”
Section: Introduction
Mentioning confidence: 99%
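For contrast with the wrapper approach, the mutual-information (MI) technique named in the last excerpt is a filter: it ranks features against the label without training any classifier, so it is fast but blind to the downstream model. A minimal sketch, with the dataset chosen purely for illustration:

```python
# Filter-style feature ranking by mutual information (sketch).
# No classifier is trained -- each feature is scored directly
# against the label, in contrast with the wrapper loop above.
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif

X, y = load_iris(return_X_y=True)
mi = mutual_info_classif(X, y, random_state=0)

# Feature indices ordered from most to least informative.
ranking = sorted(range(X.shape[1]), key=lambda i: mi[i], reverse=True)
print(ranking)
```

Because the scores are computed once per feature rather than once per candidate subset, this runs in a single pass, which is the speed/precision trade-off the excerpts describe.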