2021
DOI: 10.1109/access.2021.3112169

Metric and Accuracy Ranked Feature Inclusion: Hybrids of Filter and Wrapper Feature Selection Approaches

Abstract: Feature selection has emerged as a craft by which we boost the performance of a learning model. Feature or attribute selection is a data preprocessing technique in which only the most informative features are retained and passed to the predictor. This reduces the computational overhead and improves the correctness of the classifier. Attribute selection is commonly carried out by applying some filter or by using the performance of the learning model to gauge the quality of the attribute subset. Metric Rank…
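
Below is a minimal, hypothetical sketch of the general filter-plus-wrapper hybrid idea the abstract describes: rank features with a filter metric, then admit them one at a time only if cross-validated accuracy improves. This is not the paper's Metric/Accuracy Ranked Feature Inclusion procedure; the metric, classifier, and dataset are placeholder choices.

```python
# Hypothetical illustration of a filter + wrapper hybrid, NOT the paper's
# exact algorithm. Filter step: rank features by mutual information.
# Wrapper step: keep a feature only if it raises cross-validated accuracy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

scores = mutual_info_classif(X, y, random_state=0)  # filter metric
ranked = np.argsort(scores)[::-1]                   # best feature first

selected, best_acc = [], 0.0
clf = LogisticRegression(max_iter=5000)
for f in ranked:
    candidate = selected + [int(f)]
    acc = cross_val_score(clf, X[:, candidate], y, cv=5).mean()
    if acc > best_acc:                              # wrapper acceptance gate
        selected, best_acc = candidate, acc

print(f"kept {len(selected)}/{X.shape[1]} features, CV accuracy {best_acc:.3f}")
```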

Cited by 22 publications (6 citation statements); references 68 publications.

Citation statements (ordered by relevance):
“…Genetic Algorithms (GAs) [17–19] use principles of natural evolution to explore the feature space. GAs are particularly useful for high-dimensional datasets but are computationally expensive and can suffer from premature convergence [20–22]. PSO simulates the social behavior of birds and fish to explore the feature space.…”
Section: In-depth Review of Existing Machine Learning Models Used For...
confidence: 99%
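
As a rough illustration of the GA-based search this statement describes, here is a small, self-contained sketch: individuals are binary feature masks, fitness is cross-validated accuracy, and the population size, generation count, and mutation rate are arbitrary illustrative values, not ones taken from the cited works.

```python
# Hypothetical GA-based feature selection: binary masks as individuals,
# cross-validated accuracy as fitness. All hyperparameters are illustrative.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(16, n_features)).astype(bool)
for gen in range(8):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:8]]     # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(8)], parents[rng.integers(8)]
        cut = rng.integers(1, n_features)           # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_features) < 0.05        # bit-flip mutation
        children.append(child ^ flip)
    pop = np.array(children)

best = max(pop, key=fitness)
print(f"best mask keeps {best.sum()} features, CV accuracy {fitness(best):.3f}")
```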
“…The mutual information value is normalized to [0, 1] to obtain the NMI value NMI(X; Y); see equation (6).…”
Section: NMI-Based Redundant Feature Filtering
confidence: 99%
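
The exact normalization used in equation (6) of the citing work is not reproduced on this page. Purely as an illustration, one standard way to map mutual information into [0, 1] is scikit-learn's entropy-averaged normalization, shown below on toy discretized data.

```python
# Illustration only: one common normalization of mutual information to [0, 1].
# The citing paper's equation (6) may use a different normalizing denominator.
from sklearn.metrics import normalized_mutual_info_score

x = [0, 0, 1, 1, 2, 2]   # discretized values of one variable (toy example)
y = [0, 0, 1, 1, 1, 2]   # discretized values of the other variable
print(normalized_mutual_info_score(x, y))  # result lies in [0, 1]
```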
“…The wrapper method [2,3] selects features mainly according to the training performance of the subsequent machine learning algorithm, which requires many rounds of training and incurs a large computational cost. The embedding method [4–6] integrates feature selection and model training into one process, selecting features while training, but its parameter settings are complicated and its time complexity is high. Typhoon-related data is a kind of spatio-temporal series data.…”
confidence: 99%
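
To make the wrapper-versus-embedded contrast concrete, a short scikit-learn sketch follows: a wrapper selector that repeatedly retrains the model (hence its higher computational cost) next to an embedded L1-regularized selector that picks features during a single training run. The estimators, dataset, and parameters are illustrative and unrelated to the citing typhoon study.

```python
# Illustrative contrast between a wrapper and an embedded selector; standard
# scikit-learn tools, not the methods used in the citing work.
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SelectFromModel, SequentialFeatureSelector
from sklearn.linear_model import Lasso, LinearRegression

X, y = load_diabetes(return_X_y=True)

# Wrapper: retrains the estimator for each candidate subset, so cost grows
# quickly with the number of features.
wrapper = SequentialFeatureSelector(LinearRegression(), n_features_to_select=4, cv=5)
wrapper.fit(X, y)
print("wrapper keeps:", wrapper.get_support().nonzero()[0])

# Embedded: selection happens inside one training run via L1 regularization,
# at the price of an extra hyperparameter (alpha) to tune.
embedded = SelectFromModel(Lasso(alpha=0.1)).fit(X, y)
print("embedded keeps:", embedded.get_support().nonzero()[0])
```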
“…Elimination of insignificant features results in dimensionality reduction. Filter, wrapper, and hybrid methods are conventional feature selection methods [14]. In a high-dimensional dataset, certain features have a low correlation with the response variable.…”
Section: Feature Selection
confidence: 99%
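
A minimal sketch of the correlation-based filtering idea mentioned above: features whose absolute Pearson correlation with the response falls below a threshold are dropped. The 0.3 cutoff and the dataset are hypothetical choices, not values from the cited work.

```python
# Minimal correlation-based filter: drop features weakly correlated with the
# response. Threshold and dataset are illustrative only.
import numpy as np
from sklearn.datasets import load_diabetes

X, y = load_diabetes(return_X_y=True)

corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
keep = corr >= 0.3  # hypothetical cutoff
print("kept feature indices:", np.flatnonzero(keep))
```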