2020
DOI: 10.48550/arXiv.2007.14920
Preprint

Fibonacci and k-Subsecting Recursive Feature Elimination

Abstract: Feature selection is a data mining task with the potential of speeding up classification algorithms, enhancing model comprehensibility, and improving learning accuracy. However, finding a subset of features that is optimal in terms of predictive accuracy is usually computationally intractable. Out of several heuristic approaches to dealing with this problem, the Recursive Feature Elimination (RFE) algorithm has received considerable interest from data mining practitioners. In this paper, we propose two novel a…
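
For context, the sketch below shows standard RFE via scikit-learn's RFE class. The synthetic dataset, logistic-regression estimator, and parameter values are illustrative assumptions, not taken from the paper; the paper's Fibonacci and k-subsecting variants concern how many features are dropped per round, which this fixed-step loop does not implement.

    # Minimal sketch of standard Recursive Feature Elimination (RFE).
    # Dataset and estimator choices are illustrative assumptions only.
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import RFE
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=200, n_features=30,
                               n_informative=5, random_state=0)

    # Eliminate one feature per iteration until 5 remain.
    selector = RFE(LogisticRegression(max_iter=1000),
                   n_features_to_select=5, step=1)
    selector.fit(X, y)

    print(selector.support_)   # boolean mask over the original 30 features
    print(selector.ranking_)   # 1 = kept; larger ranks were dropped earlier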

Cited by 1 publication (1 citation statement)
References 19 publications
“…Recursive Feature Elimination (RFE): RFE recursively eliminates the least important features until a specified number remains. Advantages: easy to implement; disadvantages: sensitive to parameter settings (Brzezinski, 2020). Gradient Boosting Feature Selection (GBFS): GBFS selects features based on their contribution to the gradient boosting framework.…”
Section: Feature Selection
Citation type: mentioning (confidence: 99%)
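
The quoted statement summarizes RFE's core loop: repeatedly drop the least important feature until a target count remains. As a rough illustration of that loop (not the authors' method), the hypothetical rfe helper below refits an estimator each round and removes the weakest surviving feature, assuming the estimator exposes feature_importances_; the data and estimator are illustrative assumptions.

    # Hypothetical hand-rolled RFE loop illustrating the quoted description.
    import numpy as np
    from sklearn.base import clone
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    def rfe(estimator, X, y, n_features_to_select):
        # Refit on the surviving columns, drop the least important one,
        # and repeat until the requested number of features remains.
        remaining = list(range(X.shape[1]))
        while len(remaining) > n_features_to_select:
            model = clone(estimator).fit(X[:, remaining], y)
            worst = int(np.argmin(model.feature_importances_))
            del remaining[worst]
        return remaining

    X, y = make_classification(n_samples=300, n_features=20,
                               n_informative=4, random_state=1)
    print(rfe(RandomForestClassifier(random_state=1), X, y,
              n_features_to_select=4))

Note that each round requires a full refit, which is exactly the cost the paper's Fibonacci and k-subsecting schedules aim to reduce relative to one-at-a-time elimination.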