2021
DOI: 10.32890/jict2021.20.2.4

An Improved Grey Wolf Optimization-Based Learning of Artificial Neural Network for Medical Data Classification

Abstract: Grey wolf optimization (GWO) is a recent and popular swarm-based metaheuristic approach. It has been used in numerous fields such as numerical optimization, engineering problems, and machine learning. Different variants of GWO have been developed in the last five years for solving optimization problems in diverse fields. Like other metaheuristic algorithms, GWO also suffers from local optima and slow convergence, resulting in degraded performance. An adequate equilibrium between exploration and exploita…

Cited by 13 publications (13 citation statements)
References 59 publications
“…In this study, the weights and biases are optimized with a variant of grey wolf optimization (GWO) [54]. This variant, named inertia motivated grey wolf optimization (IMGWO), was proposed by Kumar et al. in 2021 [22]. IMGWO was proposed to increase the exploration and exploitation capabilities of GWO, which usually becomes entrapped in local minima and converges slowly during the later part of evolution [55]. IMGWO is explained in Algorithm 1.…”
Section: Methodology
confidence: 99%
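The quoted statement describes optimizing an ANN's weights and biases with a GWO variant. As a rough illustration, the sketch below implements plain (standard) GWO — not the cited IMGWO variant, whose inertia terms are defined in the paper itself — minimizing a generic loss over a flattened weight vector; the function name and parameters are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: standard GWO (not IMGWO) minimizing a loss over a
# flattened ANN weight vector. In a real ANN setting, `loss` would
# unflatten the vector into layer weights and return a training error.
def gwo_optimize(loss, dim, n_wolves=10, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(-1.0, 1.0, (n_wolves, dim))  # initial positions
    for t in range(iters):
        fitness = np.array([loss(w) for w in wolves])
        order = np.argsort(fitness)
        alpha, beta, delta = wolves[order[:3]]  # three best wolves lead
        a = 2.0 - 2.0 * t / iters  # 'a' decays linearly from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a          # exploration/exploitation mix
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                new_pos += leader - A * D
            wolves[i] = new_pos / 3.0          # average pull of the leaders
    fitness = np.array([loss(w) for w in wolves])
    return wolves[np.argmin(fitness)]

# Toy usage: a sphere function stands in for an ANN training loss.
best = gwo_optimize(lambda w: float(np.sum(w ** 2)), dim=5, iters=200)
```

IMGWO, per the citing text, modifies this update to better balance exploration and exploitation; the decay schedule of `a` above is where such inertia-motivated changes would apply.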
“…ANNs performed better than other classifiers while diagnosing heart problems [17], [18]; however, ANNs are often entrapped in local minima and suffer from poor convergence [19], [20]. The poor convergence and local minima entrapment may be handled by hybridizing ANN with suitable metaheuristic optimization methods [21], [22]. In this paper, classification via the ANN-IMGWO model is performed, preceded by feature selection on a real-world heart disease dataset. Other contemporary metaheuristic optimization methods viz.…”
Section: Equation (1)
confidence: 99%
“…In the network training process, for the output results of each layer, the degree of classification error is measured by a loss function. The weight of each layer is then updated by gradient descent [13].…”
Section: Dissemination Model Based on Deep Residual Network
confidence: 99%
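The statement above summarizes the standard gradient-descent weight update. A minimal sketch, assuming a single linear layer and mean-squared-error loss (both my simplifying assumptions, not details from the cited paper):

```python
import numpy as np

# Hedged sketch: one gradient-descent update of a layer's weights W
# under L = 0.5 * ||W x - y||^2 (linear layer, MSE loss, single sample).
def gd_step(W, x, y, lr=0.1):
    y_hat = W @ x                   # layer output
    grad = np.outer(y_hat - y, x)   # dL/dW = (y_hat - y) x^T
    return W - lr * grad            # move weights against the gradient

rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))
x = np.array([1.0, -1.0, 0.5])
y = np.array([0.0, 1.0])
for _ in range(200):                # repeated updates shrink the error
    W = gd_step(W, x, y)
```

With a single sample, each step scales the residual by a constant factor below 1, so the output converges to the target; deep networks apply the same rule layer by layer via backpropagation.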