2022
DOI: 10.3390/make4010007
A Novel Framework for Fast Feature Selection Based on Multi-Stage Correlation Measures

Abstract: Datasets with thousands of features represent a challenge for many existing learning methods because of the well-known curse of dimensionality. Moreover, the presence of irrelevant and redundant features in any dataset can degrade the performance of any model in which training and inference are attempted. In addition, in large datasets, manual management of features tends to be impractical. Therefore, there is increasing interest in developing frameworks for the automatic discovery and removal of us…

Year Published: 2024

Cited by 1 publication (1 citation statement)
References 36 publications
“…However, in the case of a small number of training samples, inaccurate estimates of mutual information may appear, and the method is biased towards features with a large number of different values due to the use of this metric. In [30], a feature selection framework for large datasets was proposed based on a cascade of methods capable of detecting nonlinear relationships between two features and designed to achieve a balance between accuracy and speed.…”
Section: Feature Selection (mentioning, confidence: 99%)
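The bias mentioned in the citation statement — plug-in mutual-information estimates favoring features with many distinct values when samples are few — can be illustrated with a minimal histogram-based estimator. This is a toy sketch for intuition only, not the method of either cited paper; the feature sizes and sample count are arbitrary assumptions.

```python
import numpy as np

def plugin_mi(x, y):
    """Plug-in (histogram) estimate of mutual information I(X;Y) in bits."""
    n = len(x)
    joint, px, py = {}, {}, {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
        px[xi] = px.get(xi, 0) + 1
        py[yi] = py.get(yi, 0) + 1
    mi = 0.0
    for (xi, yi), c in joint.items():
        p_xy = c / n
        mi += p_xy * np.log2(p_xy / ((px[xi] / n) * (py[yi] / n)))
    return mi

rng = np.random.default_rng(0)
n = 40                              # deliberately small sample
y = rng.integers(0, 2, n)           # binary target
low_card = rng.integers(0, 2, n)    # noise feature with 2 distinct values
high_card = rng.integers(0, 20, n)  # noise feature with 20 distinct values

mi_low = plugin_mi(low_card, y)
mi_high = plugin_mi(high_card, y)
# Both features are independent of y, yet the many-valued one
# receives the larger estimated MI — the small-sample bias
# the citing paper warns about.
```

The expected overestimate of the plug-in estimator grows roughly with the number of joint cells divided by the sample size, which is why the 20-valued noise feature reliably outscores the binary one here.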