2021
DOI: 10.1016/j.eswa.2021.114575
A modified equilibrium optimizer using opposition-based learning and novel update rules

Cited by 49 publications (21 citation statements)
References 72 publications
“…This section of the paper comprehensively deliberates the results obtained by the proposed I-AVO algorithm and other selected algorithms, such as AVO, SMA, MPA, ( 39) [83], Chaotic GBO [64], Chaotic Jaya [108], and OBL-GWO. All the selected algorithms are combined with the NR method to get a fair result for performance comparison.…”
Section: Results (citation type: mentioning, confidence: 99%)
“…As a result, improving these methods is difficult because it necessitates more exertion in the optimization process, computing, statistics, and noise impacts. The chaotic drifted JAYA algorithm [79], improved JAYA [80], linear population size reduction based on success history adaptive DE (LSHADE) [46], GWO-PSO [81], GWO-Cuckoo search [82], chaotic GBO (CGBO) [64], Opposition-Based Learning EO (OBLEO) algorithm [83], OBLGWO [84], and others are some examples. The PV parameter estimation optimization problem is a hot topic right now.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
“…With the help of optimization techniques, a large number of problems encountered in different applied disciplines could be solved in a more efficient, accurate, and real-time way [5,6]. However, with the increasing complexity of global optimization problems nowadays, conventional mathematical methods based on gradient information are challenged by high-dimensional, suboptimal regions, and large-scale search ranges that cannot adapt to the real requirements [7,8]. The development of more effective tools to settle these complex NP-hard problems is an indivisible research hotspot.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
“…With their own distinctive characteristics, these metaheuristics are commonly used in a variety of computing science fields, such as fault diagnosis [42], feature selection [43], engineering optimization [44], path planning [45], and parameter identification [46]. Nevertheless, it has been shown that even the most basic algorithms still suffer from slow convergence, poor accuracy, and a tendency to become trapped in local optima [7,15] in several applications. The no-free-lunch (NFL) theorem indicates that there is no general algorithm that could be appropriate for all optimization tasks [47].…”
Section: Introduction (citation type: mentioning, confidence: 99%)
“…Examples of combining metaheuristic algorithms with the OBL scheme can be found in the literature. A few such enhanced metaheuristic algorithms are particle swarm optimization (Wang et al, 2011), firefly algorithm (Yu et al, 2015), grasshopper optimization algorithm (Ewees et al, 2018), salp swarm algorithm (Tubishat et al, 2020), crow search algorithm (Shekhawat and Saxena, 2020), equilibrium optimization (Fan et al, 2021), grey wolf optimization (Yu et al, 2021), and artificial electric field algorithm (Demirören et al, 2021). These studies have shown that combining the OBL scheme with metaheuristic algorithms improves the convergence rate and exploration capability of the original forms.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
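The opposition-based learning (OBL) scheme the excerpt above refers to rests on one simple idea: for a candidate solution x in bounds [lb, ub], also evaluate its opposite point lb + ub − x, and keep the fitter of the pair. The sketch below shows OBL applied to population initialization; the function names, parameters, and the sphere objective are illustrative assumptions, not the exact implementation of any of the cited papers.

```python
import random

def opposition_based_init(pop_size, dim, lb, ub, objective):
    """Opposition-based population initialization (illustrative sketch).

    For each random candidate x, also form its opposite point
    lb + ub - x and keep whichever of the pair scores better
    (lower) on the objective.
    """
    population = []
    for _ in range(pop_size):
        x = [random.uniform(lb, ub) for _ in range(dim)]
        x_opp = [lb + ub - xi for xi in x]  # opposite point, coordinate-wise
        # keep the fitter of the candidate/opposite pair
        population.append(min(x, x_opp, key=objective))
    return population

# Usage: minimize the sphere function on the asymmetric box [-5, 3]^3
# (asymmetric bounds make the opposite point genuinely different in fitness).
sphere = lambda v: sum(t * t for t in v)
pop = opposition_based_init(10, 3, -5.0, 3.0, sphere)
```

In a full metaheuristic such as the equilibrium optimizer, the same candidate-versus-opposite comparison is typically reapplied during the search iterations as well, which is the mechanism the listed studies credit for the improved convergence rate.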