2020 IEEE International Students' Conference on Electrical, Electronics and Computer Science (SCEECS)
DOI: 10.1109/sceecs48394.2020.189
Feature Selection Techniques and its Importance in Machine Learning: A Survey

Cited by 11 publications (5 citation statements)
References 15 publications
“…The significance of feature selection is multifaceted, reducing computational costs, identifying irrelevant features, and aiding the whole generation of well-classified models [8]. Focusing on the most relevant features also eliminates noise and redundancy, thereby improving the performance and accuracy while also reducing the dimensionality and optimizing the overall ML approaches in diverse fields [9,10].…”
Section: Introduction
confidence: 99%
“…With the maturation of technology and the concurrent advancement in ML methodologies, a trend toward more intricate and profound models becomes evident. Concomitant with these developments is the emergence of new paradigms in feature selection, as researchers are now endeavoring to incorporate feature selection techniques into deep learning models, underscoring the burgeoning significance of this approach even within the context of contemporary, high-dimensionality, and complex models [8][9][10][11].…”
Section: Introduction
confidence: 99%
“…Wrapper techniques select features by exploring various feature subsets and evaluating model performance; the search is wrapped around, but separate from, the model's own training procedure. Embedded approaches are integrated into certain machine learning algorithms, such as random forest (RF) and support vector machine (SVM), and often have inherent evaluation metrics for selecting optimal features (Thomas and Gupta, 2020). Because they account for the training samples and for interdependencies between features, embedded approaches typically yield superior model performance compared with other feature selection methods (Kumar, 2014; Thomas and Gupta, 2020).…”
confidence: 99%
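The wrapper approach described above can be sketched in a few lines. This is an illustrative example, not code from the survey: a greedy forward search that repeatedly adds whichever feature most improves the accuracy of a simple nearest-centroid classifier (standing in for the black-box model a wrapper method would train). The dataset, classifier, and function names are all assumptions for the sketch.

```python
# Sketch of wrapper-style greedy forward feature selection (stdlib only).
# The nearest-centroid classifier is a stand-in for any black-box model.
import random

random.seed(0)

def make_data(n=200, n_features=5):
    """Two classes; only features 0 and 1 are informative, the rest are noise."""
    X, y = [], []
    for _ in range(n):
        label = random.randint(0, 1)
        row = [random.gauss(2.0 * label, 1.0),    # informative
               random.gauss(-2.0 * label, 1.0)]   # informative
        row += [random.gauss(0, 1) for _ in range(n_features - 2)]  # noise
        X.append(row)
        y.append(label)
    return X, y

def centroid_accuracy(X, y, feats):
    """Accuracy of a nearest-centroid classifier restricted to `feats`."""
    cents = {}
    for label in (0, 1):
        rows = [x for x, t in zip(X, y) if t == label]
        cents[label] = [sum(r[f] for r in rows) / len(rows) for f in feats]
    correct = 0
    for x, t in zip(X, y):
        pred = min(cents, key=lambda c: sum((x[f] - m) ** 2
                                            for f, m in zip(feats, cents[c])))
        correct += pred == t
    return correct / len(X)

def forward_select(X, y, k=2):
    """Greedy wrapper: add the feature that improves model accuracy most."""
    selected, remaining = [], list(range(len(X[0])))
    while len(selected) < k:
        best = max(remaining,
                   key=lambda f: centroid_accuracy(X, y, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

X, y = make_data()
print(sorted(forward_select(X, y, k=2)))  # recovers informative features [0, 1]
```

Because each candidate subset is scored by retraining and re-evaluating the model, wrapper methods are accurate but computationally expensive, which is exactly the trade-off the quoted excerpt contrasts with embedded approaches.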
“…Algorithms frequently used in the modeling stage include partial least squares regression (PLSR) (Knox et al, 2015), ensemble learning (Carranza, 2015; Tan et al, 2020; Lin et al, 2022), SVM (Iglesias, 2020; Chatterjee et al, 2022), artificial neural networks (ANN) (Saikia et al, 2020), and deep learning (Zhang T. et al, 2023).…”
confidence: 99%
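The embedded approaches mentioned in these excerpts score features as a by-product of model training, as tree-based importances do in a random forest. The following is a minimal sketch in that spirit, not the survey's method: each feature is scored by the Gini-impurity reduction of its best single-threshold split (a one-node decision stump), and the highest-scoring feature is kept. The synthetic data and all names are assumptions for illustration.

```python
# Sketch of an embedded-style feature ranking: score each feature by the
# Gini-impurity reduction of its best decision-stump split (stdlib only).
import random

random.seed(1)

def gini(labels):
    """Gini impurity of a binary label list."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def stump_importance(values, labels):
    """Best impurity reduction over candidate thresholds for one feature."""
    parent, best, n = gini(labels), 0.0, len(labels)
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        child = (len(left) * gini(left) + len(right) * gini(right)) / n
        best = max(best, parent - child)
    return best

# Synthetic data: feature 0 separates the classes, features 1-3 are noise.
X, y = [], []
for _ in range(150):
    label = random.randint(0, 1)
    X.append([random.gauss(3.0 * label, 1.0)] +
             [random.gauss(0, 1) for _ in range(3)])
    y.append(label)

scores = [stump_importance([row[f] for row in X], y) for f in range(4)]
top = max(range(4), key=lambda f: scores[f])
print(top)  # feature 0 carries the class signal
```

A full random forest aggregates such impurity reductions over many trees and bootstrap samples, which is why its built-in importances can serve as the "inherent evaluation metric" the excerpt attributes to embedded methods.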
“…However, the reviewed literature used statistical tools such as STATA, Epi Info, and Microsoft Excel to analyze feature importance (Gitige et al, 2021; Mwango et al, 2020). This study used machine learning techniques to avoid ignoring nonlinear features (Gupta, 2020).…”
Section: Discussion
confidence: 99%