Encyclopedia of Complexity and Systems Science 2009
DOI: 10.1007/978-0-387-30440-3_317
Manipulating Data and Dimension Reduction Methods: Feature Selection

Cited by 44 publications (43 citation statements). References 18 publications.
“…The second style is called the wrapper method, which directly connects the selection of the feature subset to the performance of some specific classifier, such as an SVM, decision tree, or random forest. The filtering-based approach is more universal and is independent of classifiers, whereas the wrapper-based approach tends to perform better in practice [67,69]. Both styles rely on efficient optimization algorithms.…”
Section: Curse Of Dimensionality (mentioning)
confidence: 99%
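The contrast drawn in the quoted passage can be made concrete with a short sketch. Below, a minimal scikit-learn example ranks features by mutual information (filter style, independent of any classifier) and then selects a subset by cross-validating a linear SVM (wrapper style); the synthetic dataset, subset size, and choice of SVM are illustrative assumptions, not part of the original text.

```python
# Hedged sketch: filter vs. wrapper feature selection (assumed setup).
from sklearn.datasets import make_classification
from sklearn.feature_selection import (SelectKBest, mutual_info_classif,
                                       SequentialFeatureSelector)
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# Filter style: score features with a classifier-independent criterion
# (mutual information) and keep the top k.
filt = SelectKBest(mutual_info_classif, k=8).fit(X, y)
print("filter-selected features: ", filt.get_support(indices=True))

# Wrapper style: tie subset selection to the cross-validated accuracy
# of one specific classifier (here a linear SVM).
wrap = SequentialFeatureSelector(SVC(kernel="linear"),
                                 n_features_to_select=8,
                                 direction="forward", cv=3).fit(X, y)
print("wrapper-selected features:", wrap.get_support(indices=True))
```

On a toy problem like this the two subsets often overlap heavily; the practical difference lies in cost (the wrapper refits the SVM many times) and in how specific the result is to the chosen classifier.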
“…This approach, called the wrapper approach, has been used with most classification algorithms: decision trees (DT), support vector machines (SVMs), Naïve Bayes (NB), K-nearest neighbor (KNN), artificial neural networks (ANNs), and linear discriminant analysis (LDA) have all been applied as wrappers for feature selection [13], [14], [15].…”
Section: Related Work 2.1 GP and Feature Extraction (mentioning)
confidence: 99%
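To illustrate the quoted point that a wrapper can sit on top of almost any classifier, here is a hedged sketch that runs the same forward-selection wrapper over a decision tree, an SVM, Naïve Bayes, and KNN; the dataset and subset size are assumptions chosen only for illustration.

```python
# Hedged sketch: one wrapper, several base classifiers (assumed setup).
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

for clf in (DecisionTreeClassifier(random_state=0), SVC(kernel="linear"),
            GaussianNB(), KNeighborsClassifier()):
    # The wrapper scores candidate subsets by cross-validating this
    # particular classifier, so each classifier may prefer different features.
    sfs = SequentialFeatureSelector(clf, n_features_to_select=5, cv=3).fit(X, y)
    print(type(clf).__name__, "->", sfs.get_support(indices=True))
```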
“…Filter algorithms are often computationally less expensive and more general than wrapper algorithms. However, filters ignore the performance of the selected features on a classification algorithm, whereas wrappers evaluate feature subsets by their classification performance, so wrappers usually outperform filters for a particular classification algorithm [1], [7], [8]. Note that some researchers categorize feature selection methods into three groups: 1) wrapper; 2) embedded; and 3) filter approaches [7], [8].…”
Section: II (mentioning)
confidence: 99%
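Since the quote also mentions a third, "embedded" category, a brief sketch of that style may help: selection happens inside model training itself, for example through an L1 penalty that drives uninformative coefficients to zero. The dataset, penalty strength, and use of logistic regression below are illustrative assumptions.

```python
# Hedged sketch: embedded feature selection via an L1-penalized model.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# The L1 penalty zeroes out uninformative coefficients while the classifier
# is fitted, so feature selection is embedded in the training step itself.
embedded = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
).fit(X, y)
print("embedded-selected features:", embedded.get_support(indices=True))
```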
“…There are very few feature selection methods that use an exhaustive search [1], [7], [8]. This is because even when the number of features is relatively small (e.g., 50), in many situations, such methods are computationally too expensive to perform.…”
Section: Existing Work On Feature Selection 1) Search Techniques (mentioning)
confidence: 99%
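A quick back-of-the-envelope calculation shows why exhaustive search over even 50 features is impractical: there are 2^50 − 1 non-empty feature subsets to evaluate. The evaluation rate used below is an assumed, optimistic figure, not a value from the text.

```python
# Rough illustration of the combinatorial cost of exhaustive subset search.
n_features = 50
n_subsets = 2 ** n_features - 1          # every non-empty feature subset
evals_per_second = 1_000_000             # assumed (optimistic) evaluation rate

years = n_subsets / evals_per_second / (60 * 60 * 24 * 365)
print(f"{n_subsets:.3e} subsets -> about {years:,.0f} years at 1e6 evals/s")
```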