2021
DOI: 10.1155/2021/2555622

Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection

Abstract: Feature selection is the process of decreasing the number of features in a dataset by removing redundant, irrelevant, and randomly class-corrected data features. By applying feature selection on large and highly dimensional datasets, the redundant features are removed, reducing the complexity of the data and reducing training time. The objective of this paper was to design an optimizer that combines the well-known metaheuristic population-based optimizer, the grey wolf algorithm, and the gradient descent algor…
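The abstract describes combining the grey wolf optimizer (GWO) with gradient descent, but the excerpt is truncated before the coupling is specified. The sketch below is therefore only one plausible arrangement, not the paper's method: a standard GWO position update (wolves pulled toward the alpha, beta, and delta leaders) followed by a gradient descent refinement step on each wolf. The names `hybrid_gd_gwo` and the `sphere` demo objective are illustrative, not from the paper.

```python
import numpy as np

def sphere(x):
    """Demo objective (sphere function); the paper uses its own fitness functions."""
    return np.sum(x ** 2)

def sphere_grad(x):
    """Analytic gradient of the demo objective."""
    return 2 * x

def hybrid_gd_gwo(fitness, grad, dim=5, n_wolves=10, iters=50, lr=0.01, seed=0):
    """Illustrative hybrid: GWO position update, then a gradient step per wolf.
    This is one plausible coupling; the paper's exact scheme may differ."""
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(-5, 5, (n_wolves, dim))
    for t in range(iters):
        # Rank wolves by fitness: alpha, beta, delta lead the pack.
        order = np.argsort([fitness(w) for w in wolves])
        alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]
        a = 2 - 2 * t / iters  # exploration factor decays from 2 to 0
        for i in range(n_wolves):
            x = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                x += leader - A * np.abs(C * leader - wolves[i])
            wolves[i] = x / 3.0                # average pull of the three leaders
            wolves[i] -= lr * grad(wolves[i])  # gradient descent refinement
    best = min(wolves, key=fitness)
    return best, fitness(best)

best, val = hybrid_gd_gwo(sphere, sphere_grad)
```

For feature selection, as in the binary implementations the citing papers mention, the continuous positions would additionally be squashed (e.g. through a sigmoid) and thresholded into 0/1 feature masks.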

Cited by 19 publications (16 citation statements) · References 27 publications
“…A neural network (NN) based on GWO was utilized to categorize the various emotions from the selected features. Kitonyi and Segera [ 133 ] presented a hybridization of a popular metaheuristic optimizer called GWO and gradient descent algorithm to resolve feature selection issues. They first compared the approach with the baseline GWO in twenty-three test functions and developed three binary implementations, and compared the final implementation against two implementations of the binary GWO and binary GWPSO using six medical datasets taken from the UCI repository on the rate of accuracy, the number of features selected subsets, precision, F-measure, and sensitivity metrics.…”
Section: Metaheuristic Algorithms for Multiclass Feature Selection
confidence: 99%
“…The Grey Wolf Optimizer (GWO) algorithm is used to obtain the optimum CNN architecture to achieve the best classification result. The GWO algorithm is used in solving optimization problems [20][21][22][23] and improving other metaheuristic optimization algorithms [24][25][26][27][28][29]. It is preferred for hyperparameter optimization of the CNN architecture.…”
Section: Structure
confidence: 99%
“…The authors in [ 46 ] presented a hybridization of a popular metaheuristic optimizer called GWO and a gradient descent algorithm which was used to resolve feature selection issues. Similarly, a newly proposed hybridized technique comprised the Extended Binary Cuckoo Search, Genetic Algorithm, and Whale Optimization Algorithm, which aimed to reduce the time required to search a huge database during image retrieval.…”
Section: Related Literature
confidence: 99%