2022
DOI: 10.1007/s00521-022-06921-2
Large scale salp-based grey wolf optimization for feature selection and global optimization

Cited by 38 publications (19 citation statements)
References 47 publications
“…The goal of feature selection is to eliminate these unnecessary features from the model before training to increase the model’s success. Many redundant or irrelevant features increase the computational burden (Tiwari and Chaturvedi 2022), resulting in the “curse of dimensionality.” Feature selection (FS) helps select the optimal classifier by choosing the features most relevant to decision-making (Qaraad et al 2022). Because the number of candidate feature subsets grows exponentially with the number of features, FS is an NP-hard combinatorial problem.…”
Section: Methods
confidence: 99%
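The combinatorial blow-up behind the NP-hardness claim can be made concrete with a short sketch. The function and the feature counts below are illustrative, not from the paper:

```python
# Illustrative sketch: exhaustive feature selection must consider every
# non-empty subset of the feature set, and that count doubles with each
# added feature — the reason FS is an NP-hard combinatorial problem.

def num_subsets(n_features: int) -> int:
    """Number of non-empty feature subsets for n_features features."""
    return 2 ** n_features - 1

for n in (10, 20, 30):
    print(n, num_subsets(n))
# 10 features already give 1023 candidate subsets; 30 give over a billion.
```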
See 1 more Smart Citation
“…The goal of feature selection is to eliminate these unnecessary features from the model before training to increase the model’s success. Many redundant or irrelevant features increase the computational burden (Tiwari and Chaturvedi 2022 ), resulting in the “curse of dimensionality.” Feature selection (FS) helps select the optimal classifier by choosing the most relevant features to decision-making (Qaraad et al 2022 ). The number of possible solutions exponentially increases as the number of features increases in FS, which is an NP-hard combinatorial problem.…”
Section: Methodsmentioning
confidence: 99%
“…Feature selection methods fall into three main categories: (1) filters, (2) wrappers, and (3) hybrid or embedded methods (Hancer et al 2022; Tiwari and Chaturvedi 2022). Filter-based methods such as principal component analysis, F-scores, and information gain identify feature subsets using statistical metrics on the data, such as correlation and distance (Qaraad et al 2022). Although fast, these methods operate independently of the learning algorithm.…”
Section: Methods
confidence: 99%
“…The embedded methods embed feature selection into the learner training process. The filter techniques select relevant features purely from the inherent relationships among the features, without requiring the integration of any learning method [27, 28]. The wrapper methods combine a search strategy with a learning method: subsets of features produced by the search strategy are used to train a classifier, and the resulting classification performance guides the search toward the optimal subset [29, 30].…”
Section: Related Work
confidence: 99%
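The wrapper scheme described above — a search strategy scored by a trained learner — can be sketched with greedy forward selection wrapped around a toy leave-one-out 1-nearest-neighbour classifier. Both the search strategy and the learner here are illustrative stand-ins, not the methods from [29, 30]:

```python
# Wrapper-method sketch: the search strategy (greedy forward selection)
# proposes feature subsets, and each subset is scored by actually
# evaluating a learner (leave-one-out 1-NN accuracy) on it.

def accuracy_1nn(rows, labels, subset):
    """Leave-one-out 1-NN accuracy using only the features in `subset`."""
    correct = 0
    for i, row in enumerate(rows):
        best, best_d = None, float("inf")
        for j, other in enumerate(rows):
            if i == j:
                continue
            d = sum((row[f] - other[f]) ** 2 for f in subset)
            if d < best_d:
                best, best_d = labels[j], d
        correct += best == labels[i]
    return correct / len(rows)

def forward_select(rows, labels, n_features):
    """Greedily add the feature that most improves the wrapped learner."""
    chosen, remaining, best_score = [], list(range(n_features)), 0.0
    while remaining:
        score, f = max((accuracy_1nn(rows, labels, chosen + [f]), f)
                       for f in remaining)
        if score <= best_score and chosen:
            break  # no remaining feature improves the classifier
        best_score = score
        chosen.append(f)
        remaining.remove(f)
    return chosen

# Toy data: feature 0 separates the classes, feature 1 is noise.
rows = [[0.0, 5.0], [0.0, 1.0], [1.0, 9.0], [1.0, 2.0]]
labels = [0, 0, 1, 1]
print(forward_select(rows, labels, 2))  # selects only feature 0
```

Because every candidate subset is scored by training and evaluating the learner, wrappers are far costlier than filters but tailored to the classifier's actual performance.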
“…To reduce the evaluation cost, a rapid evaluation approach was also developed. To address low precision and slow convergence in high-dimensional global optimization, Salp Swarm Optimization (SSO) is combined with GWO (Qaraad et al, 2022). Initially, the leader’s position in the chain population is updated using SSO’s intensification capabilities.…”
Section: Related Work
confidence: 99%
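The hybrid idea in the statement above — the salp-chain leader updated with SSO's intensification step while the rest of the population moves with grey-wolf attraction — can be sketched in simplified form. This is an assumption-laden toy on a sphere objective, using the standard textbook SSA and GWO update rules rather than the authors' exact algorithm:

```python
# Simplified SSO+GWO hybrid sketch (NOT the paper's implementation):
# the chain leader intensifies around the best solution via the Salp
# Swarm update; followers move with GWO-style attraction toward the
# three best solutions (alpha, beta, delta).
import math
import random

def sphere(x):
    """Toy objective: sum of squares, minimized at the origin."""
    return sum(v * v for v in x)

def hybrid_sso_gwo(dim=5, pop=20, iters=200, lb=-10.0, ub=10.0, seed=0):
    rng = random.Random(seed)
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(pop)]
    gbest = min(X, key=sphere)[:]
    for t in range(iters):
        X.sort(key=sphere)
        # Copies, so mutating X[0] below does not corrupt the leaders.
        alpha, beta, delta = X[0][:], X[1][:], X[2][:]
        c1 = 2 * math.exp(-((4 * t / iters) ** 2))  # SSA coefficient
        a = 2 - 2 * t / iters                       # GWO coefficient, 2 -> 0
        # Chain leader: Salp Swarm update, intensifying around alpha.
        for d in range(dim):
            step = c1 * ((ub - lb) * rng.random() + lb)
            X[0][d] = alpha[d] + step if rng.random() < 0.5 else alpha[d] - step
            X[0][d] = min(max(X[0][d], lb), ub)
        # Followers: grey-wolf attraction toward the three best solutions.
        for i in range(1, pop):
            for d in range(dim):
                moves = []
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    moves.append(leader[d] - A * abs(C * leader[d] - X[i][d]))
                X[i][d] = min(max(sum(moves) / 3, lb), ub)
        cur = min(X, key=sphere)
        if sphere(cur) < sphere(gbest):
            gbest = cur[:]
    return gbest
```

With the leader's step length `c1` shrinking over iterations, the chain exploits the neighbourhood of the current best while the GWO followers keep some exploration, which is the division of labour the cited statement describes.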