2023
DOI: 10.3390/math11081854

Improved Beluga Whale Optimization for Solving the Simulation Optimization Problems with Stochastic Constraints

Abstract: Simulation optimization problems with stochastic constraints are optimization problems with deterministic cost functions subject to stochastic constraints. Solving the considered problem by traditional optimization approaches is time-consuming if the search space is large. In this work, an approach integrating beluga whale optimization and ordinal optimization is presented to resolve the considered problem in a relatively short time frame. The proposed approach is composed of three levels: emulator, diversi…
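As background for the abstract above, the following is a minimal Python sketch of the baseline BWO loop (exploration, exploitation, and whale-fall phases) as it is commonly described in the literature. The specific coefficients, the Cauchy step used as a stand-in for a Levy flight, and the sphere test objective are illustrative assumptions; the sketch does not reproduce the authors' three-level IBWO framework or its ordinal-optimization and stochastic-constraint handling.

    # Simplified beluga whale optimization (BWO) loop: illustrative only.
    import numpy as np

    def bwo(objective, lb, ub, dim=10, pop=30, t_max=200, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(lb, ub, size=(pop, dim))          # initial population
        fit = np.apply_along_axis(objective, 1, x)
        best = x[fit.argmin()].copy()

        for t in range(t_max):
            bf = rng.random(pop) * (1 - t / (2 * t_max))  # balance factor: exploration vs. exploitation
            wf = 0.1 - 0.05 * t / t_max                   # whale-fall probability, decreasing over time
            for i in range(pop):
                r = x[rng.integers(pop)]                  # a randomly chosen peer
                if bf[i] > 0.5:                           # exploration: pair-swimming style move
                    x_new = x[i] + (r - x[i]) * (1 + rng.random()) * np.sin(2 * np.pi * rng.random())
                else:                                     # exploitation: move toward the best with a heavy-tailed step
                    step = rng.standard_cauchy(dim) * 0.05 * (ub - lb)
                    x_new = rng.random() * best - rng.random() * x[i] + step
                if rng.random() < wf:                     # whale fall: random relocation component
                    x_new = (rng.random() * x[i] - rng.random() * r
                             + (ub - lb) * np.exp(-2 * wf * pop * t / t_max) * rng.random(dim))
                x_new = np.clip(x_new, lb, ub)
                f_new = objective(x_new)
                if f_new < fit[i]:                        # greedy replacement
                    x[i], fit[i] = x_new, f_new
                    if f_new < objective(best):
                        best = x_new.copy()
            # (IBWO adds an improved learning approach, accelerated search, and
            #  greater variety/consistency of candidates at this point.)
        return best

    # usage: minimize a simple sphere function
    print(bwo(lambda v: float(np.sum(v**2)), lb=-5.0, ub=5.0))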

Cited by 18 publications (5 citation statements)
References 43 publications
“…Moving on, swarm intelligence (SI) algorithms such as the starling murmuration optimizer (SMO), golden jackal optimization (GJO), the white shark optimizer (WSO), the dandelion optimizer (DO), the search in forest optimizer (SIFO), the snake optimizer (SO), and beluga whale optimization (BWO) are broadly applied across sectors and have been shown to outperform conventional optimization methods. Among these optimizers, BWO is a trending nature-inspired method that mimics the attacking and feeding behaviors of beluga whales [116]. BWO offers high stability, strong search ability, and a fast convergence rate and speed, but it suffers from premature convergence and the risk of being trapped in local optima.…”
Section: Research Gap and Recommendations
confidence: 99%
“…BWO offers high stability, strong search ability, and a fast convergence rate and speed, but it suffers from premature convergence and the risk of being trapped in local optima. Recently, Horng and Lin [116] proposed an improved BWO (IBWO) that refines the learning approach, accelerates the search process, and increases the variety and consistency of the chosen candidates in order to provide a more reliable optimization process.…”
Section: Research Gap and Recommendations
confidence: 99%
“…By drawing on how other heuristic optimization algorithms solve engineering problems, and by improving the way heuristic algorithms evolve, applicable multi-threading technology can be developed based on the characteristics of the optimizer [31], and search efficiency can be improved by optimizing the population of candidate solutions [32][33][34]. A binary scheme selector [35], an optimization strategy selector [36][37][38], and a dynamic parameter selector [36,37,39] can be used.…”
Section: Machine Learning Inversion
confidence: 99%
“…BWO has proved its efficiency in solving many optimization problems, such as engineering design problems (Jia et al. 2023), feature selection optimization problems (Gao et al. 2023), and simulation optimization problems with stochastic constraints (Horng and Lin 2023). Additionally, it has shown great performance in finding the optimal hyperparameter values of the VGG deep convolutional neural network as discussed in Deepika and Kuchibhotla's (2024) study, the optimal hyperparameter values of the DeepLabv3-based semantic segmentation architecture as discussed in Anilkumar and Venugopal's (2023) study, and the optimal hyperparameter values of the convolutional bidirectional long short-term memory with an autoencoder model as discussed in Asiri et al.'s (2024) study.…”
Section: Introduction
confidence: 99%