2019
DOI: 10.1016/j.asoc.2019.105521
Efficient boosted grey wolf optimizers for global search and kernel extreme learning machine training

Cited by 129 publications (53 citation statements: 0 supporting, 53 mentioning, 0 contrasting)
References 87 publications
“…In order to test the performance of the proposed GBHHO, several representative algorithms were selected for comparison. These include the classic algorithms MFO, FA [74], GWO [46], HHO, and WOA, as well as the improved algorithms ACWOA [75], OBLGWO [47], OBSCA [76], and SCADE [77]. The detailed parameter values are reported in Table 2.…”
Section: Experimental Results, A. Benchmark Function Validation (mentioning)
Confidence: 99%
“…The detailed parameter values are reported in Table 2. The CEC2014 [47] benchmark functions were selected as test functions for the experiment. The experimental conditions were kept identical to ensure a fair comparison.…”
Section: Experimental Results, A. Benchmark Function Validation (mentioning)
Confidence: 99%
“…Aljarah et al. proposed a hybrid algorithm that combines GWO with tabu search for solving clustering-based analysis problems. Heidari et al. proposed an oppositional-learning-based GWO for tuning the key parameters of the kernel extreme learning machine on two real-world problems; the proposed algorithm is compared with previous versions of GWO in terms of solution quality and convergence speed. Jayabarathi et al. proposed a hybrid grey wolf optimizer that adds differential evolution operators to solve the economic load dispatch problem in power systems.…”
Section: Grey Wolf Optimizer (mentioning)
Confidence: 99%
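The GWO variants cited above all build on the canonical grey wolf position update: the pack moves toward the three best wolves (alpha, beta, delta), with an exploration coefficient that shrinks over the run. Below is a minimal sketch of that canonical update only, not any cited author's exact implementation; the function name and default parameters are illustrative:

```python
import numpy as np

def gwo_minimize(f, lb, ub, dim, n_wolves=20, n_iter=100, seed=0):
    """Minimal canonical grey wolf optimizer (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_wolves, dim))
    fit = np.apply_along_axis(f, 1, X)
    for t in range(n_iter):
        # Exploration parameter a decreases linearly from 2 to 0.
        a = 2.0 - 2.0 * t / n_iter
        # Alpha, beta, delta are the three fittest wolves (copied so the
        # leaders stay fixed while positions are updated this iteration).
        leaders = X[np.argsort(fit)[:3]].copy()
        for i in range(n_wolves):
            new = np.zeros(dim)
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a          # attack/search coefficient
                C = 2.0 * r2                  # random emphasis on the leader
                D = np.abs(C * leader - X[i]) # distance to the leader
                new += leader - A * D
            X[i] = np.clip(new / 3.0, lb, ub) # average of the three pulls
            fit[i] = f(X[i])
    best = int(np.argmin(fit))
    return X[best], fit[best]
```

Variants such as the hybrid GWO cited above typically keep this update and add extra operators (e.g. differential evolution's mutation and crossover) on top of it.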
“…The proposed CLSGMFO approach [37] employs Gaussian mutation to increase the population diversity of MFO and a chaotic local search in the flame-updating process to better exploit the locality of solutions; it is used to perform function optimization and is combined with a hybrid kernel extreme learning machine (KELM) model for financial prediction. Motivated by opposition-based learning (OBL) and the drawbacks of GWO, OBLGWO [38] is proposed to tune the parameters of KELM for two real-world problems: the second major selection (SMS) problem and thyroid cancer diagnosis (TCD).…”
Section: Introduction (mentioning)
Confidence: 99%
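The opposition-based learning idea behind OBLGWO is simple: for each candidate x in [lb, ub], also evaluate its opposite point lb + ub - x, then keep whichever half of the combined pool is fitter. A minimal sketch of OBL applied to population initialization (the function name and interface are assumptions, not the cited paper's code):

```python
import numpy as np

def obl_init(f, lb, ub, n, dim, seed=0):
    """Opposition-based initialization sketch: sample n points, add their
    opposites, and keep the n fittest of the 2n candidates."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n, dim))
    X_op = lb + ub - X                       # opposite points, still in bounds
    pool = np.vstack([X, X_op])
    fit = np.apply_along_axis(f, 1, pool)
    keep = np.argsort(fit)[:n]               # fittest n of the 2n candidates
    return pool[keep], fit[keep]
```

The same keep-the-better-of-x-and-its-opposite step can also be applied during iterations to jump a stagnating search to the mirrored region of the space.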