Published: 2014
DOI: 10.1016/j.patcog.2013.10.009

Efficient feature size reduction via predictive forward selection

Cited by 59 publications (32 citation statements)
References 20 publications
“…The main assumption during feature selection is that the initial data set contains variables which are redundant, strongly correlated with one another or irrelevant and, therefore, can be deleted without information losses [10,11].…”
Section: Feature Selection
confidence: 99%
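The assumption quoted above — that redundant or strongly correlated variables can be deleted without information loss — can be illustrated with a minimal correlation-based filter. This is a generic sketch, not the paper's method; the threshold and toy data are illustrative:

```python
import numpy as np

def drop_correlated(X, threshold=0.95):
    """Greedy filter: keep a feature only if its absolute Pearson
    correlation with every previously kept feature stays below threshold."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

# Toy data: feature 1 is a near-copy of feature 0, feature 2 is independent noise.
rng = np.random.default_rng(0)
f0 = rng.normal(size=200)
X = np.column_stack([f0,
                     f0 + 1e-3 * rng.normal(size=200),
                     rng.normal(size=200)])
print(drop_correlated(X))  # the redundant copy of feature 0 is dropped
```

A greedy scan like this keeps the first feature of each correlated group, which is the simplest way to realize the "delete without information loss" assumption in practice.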
“…Recent work presents algorithms faster than SFS [3], [4], [5], [6], [7]. These methods commonly try to reduce the number of wrapper evaluations.…”
Section: Hybrid Ranking and Wrapper Feature Selection Algorithm
confidence: 99%
“…Two of the cited approaches merge filter and wrapper techniques [3], [5]. Another uses predictions based on meta-features [6]. The algorithm presented in [7] uses a stochastic approach, and the number of wrapper evaluations changes according to its parameters.…”
Section: Hybrid Ranking and Wrapper Feature Selection Algorithm
confidence: 99%
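For context, the baseline these cited methods try to accelerate — sequential forward selection (SFS) — greedily grows a feature subset, re-evaluating a wrapper score at each step. A minimal sketch follows; the scoring function and toy objective here are illustrative assumptions, not taken from the paper:

```python
def sfs(n_features, score, max_size=None):
    """Sequential forward selection: starting from the empty set, repeatedly
    add the feature whose inclusion maximizes `score`, and stop as soon as
    no candidate improves on the current best score."""
    selected, best = [], score(())
    max_size = max_size or n_features
    while len(selected) < max_size:
        candidates = [c for c in range(n_features) if c not in selected]
        gains = {c: score(tuple(selected + [c])) for c in candidates}
        c_best = max(gains, key=gains.get)
        if gains[c_best] <= best:
            break  # no improvement: stop the greedy search
        selected.append(c_best)
        best = gains[c_best]
    return selected, best

# Toy wrapper objective: features 0 and 3 are informative; each extra
# feature carries a small cost, so irrelevant ones are never added.
useful = {0, 3}
toy_score = lambda subset: len(useful & set(subset)) - 0.01 * len(subset)
print(sfs(5, toy_score))  # selects [0, 3], then stops
```

Each round costs one wrapper evaluation per remaining candidate, which is exactly the expense the cited filter-wrapper hybrids and meta-feature predictors aim to cut down.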