2022
DOI: 10.1007/s11831-022-09872-y

Gradient-Based Optimizer (GBO): A Review, Theory, Variants, and Applications

Abstract: This paper presents a comprehensive survey of a recent population-based algorithm, the gradient-based optimizer (GBO), and analyzes its major features. GBO is regarded as one of the most effective optimization algorithms and has been applied successfully across a range of problems and domains. The review organizes related work on GBO into GBO variants and GBO applications, and evaluates the efficiency of GBO against other metaheuristic algorithms. Finally, the conclusions concentrat…
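The abstract names GBO but does not restate how it searches. As a rough orientation only, the following is a minimal sketch of a population-based, gradient-inspired search loop of the general kind GBO belongs to; the objective function, population size, bounds, and the best/worst-guided step are illustrative assumptions, not the published GBO update rules.

```python
import numpy as np

def sphere(x):
    """Illustrative convex benchmark objective (assumption, not from the paper)."""
    return np.sum(x ** 2)

def gradient_inspired_search(obj, dim=5, pop_size=30, iters=200, seed=0):
    """Conceptual sketch of a population-based, gradient-inspired optimizer loop.
    This is NOT the exact GBO algorithm; it only mirrors its general structure:
    a population, a direction built from good/bad solutions, greedy replacement."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    fitness = np.apply_along_axis(obj, 1, pop)

    for _ in range(iters):
        best = pop[np.argmin(fitness)]
        worst = pop[np.argmax(fitness)]
        for i in range(pop_size):
            # Move toward the best member and away from the worst one,
            # with random step scaling (an illustrative assumption).
            step = rng.random(dim) * (best - pop[i]) + \
                   rng.random(dim) * (pop[i] - worst)
            candidate = pop[i] + step
            f_new = obj(candidate)
            if f_new < fitness[i]:  # greedy selection keeps only improvements
                pop[i], fitness[i] = candidate, f_new

    k = np.argmin(fitness)
    return pop[k], fitness[k]

best_x, best_f = gradient_inspired_search(sphere)
print(best_x, best_f)
```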

Cited by 37 publications (8 citation statements)
References 132 publications
“…Our results suggest that the interneuron population’s emergent attractors depend primarily on the E-I connectivity and the frequency of the driving excitatory oscillations. There are robust gradients ( Daoud et al, 2023 ) towards maximally stable interneuron phase cluster arrangements along the dimensions of pyramidal frequency and E-I connection probability ( Figures 3 , 4 ).…”
Section: Results
Mentioning confidence: 99%
“…Gradient-Based Optimization methods present several merits in optimization. They exhibit notable advantages, particularly their rapid convergence, especially when dealing with smooth and convex objective functions (Daoud et al, 2023). Moreover, their suitability for high-dimensional problems, coupled with their amenable parallelization, renders them highly efficient in resource-rich computational settings.…”
Section: Gradient-based Algorithms Vs Metaheuristic Algorithms In Opt...
Mentioning confidence: 99%
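The excerpt above attributes rapid convergence of gradient-based methods to smooth, convex objectives. The snippet below is a minimal illustration of that claim, assuming plain gradient descent on a convex quadratic with a 1/L step size; the dimension, conditioning, and iteration count are illustrative choices, not taken from the cited works.

```python
import numpy as np

# Smooth, convex quadratic f(x) = 0.5 * x^T A x - b^T x, gradient A x - b.
rng = np.random.default_rng(0)
dim = 50
A = np.diag(rng.uniform(1.0, 10.0, dim))   # positive definite, well conditioned
b = rng.normal(size=dim)
x_star = np.linalg.solve(A, b)             # known minimizer, for reference only

x = np.zeros(dim)
step = 1.0 / np.max(np.diag(A))            # 1/L step size for an L-smooth objective
for _ in range(200):
    grad = A @ x - b                       # exact gradient of the quadratic
    x -= step * grad                       # standard gradient-descent update

# Error to the optimum shrinks geometrically on this smooth convex problem.
print(np.linalg.norm(x - x_star))
```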
“…It was effectively employed to multiple engineering problems, such as static var compensator operation in power systems [41], feature selection [42], [43], parameter identification of photovoltaic models [44], human activity recognition using smartphones [45], proton exchange membrane fuel cell parameter estimation [46], structural optimization [47] and economic dispatch [48]. Despite the GBO methods' age of roughly two years, the researchers developed multiple modifications (i.e., versions) that enabled it to become compatible for tackling various types of issues [49]. Despite the fact that GBO has demonstrated its ability to successfully tackle a variety of challenges.…”
Section: A) Problem Statement
Mentioning confidence: 99%