2023
DOI: 10.1016/j.sysarc.2023.102871
MEALPY: An open-source library for latest meta-heuristic algorithms in Python

Cited by 94 publications (9 citation statements)
References 120 publications
“…All 20 metaheuristic algorithms are used to optimize the weights for the ELM network. Note that the implemented codes for all metaheuristic algorithms are available at 34.…”
Section: Methods
confidence: 99%
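The setup described in this statement, using a metaheuristic to tune the weights of an ELM network, amounts to flattening the trainable weights into a candidate vector and scoring it with a fitness function. The following is a minimal, library-free sketch of that encoding on toy data (the data, hidden-layer size, and function names are illustrative assumptions, not the cited paper's code):

```python
import math
import random

random.seed(1)

# Toy regression data: y = 2x + 1 with a little noise.
xs = [i / 10 for i in range(20)]
ys = [2 * x + 1 + random.gauss(0, 0.01) for x in xs]

# Random, frozen hidden layer (the ELM idea): only the output
# weights are tuned by the optimizer.
HIDDEN = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(5)]

def hidden_out(x):
    """Activations of the fixed random hidden layer for input x."""
    return [math.tanh(a * x + b) for a, b in HIDDEN]

def fitness(w):
    """Mean squared error of the network whose output weights are w.
    This is the objective a metaheuristic would minimize."""
    err = 0.0
    for x, y in zip(xs, ys):
        pred = sum(wi * hi for wi, hi in zip(w, hidden_out(x)))
        err += (pred - y) ** 2
    return err / len(xs)
```

Any of the 20 algorithms mentioned above would then search the 5-dimensional weight space for a vector minimizing `fitness`.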
“…To address these problems, in this paper, we introduce three state-of-the-art mathematics-inspired metaheuristic algorithms for training the ELM network: Pareto-like sequential sampling (PSS), weighted mean of vectors (INFO), and the Runge–Kutta optimizer (RUN). The rationale for adopting these three recommended algorithms derives from the theoretical idea of the No-Free-Lunch theorem 34, which states that no single algorithm outperforms all other algorithms on all problems. Thus, proposing three algorithms is expected to yield superior effectiveness and performance capabilities.…”
Section: Introduction
confidence: 99%
“…During the experiment, all metaheuristic optimization algorithms to be compared were programmed and implemented using the open-source Python library MEALPY 2.4.1 [36] to ensure the consistent execution of different optimization strategies and effective comparison of the results produced.…”
Section: Model Performance Evaluation
confidence: 99%
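Libraries like MEALPY standardize the loop that all such optimizers share: sample candidates within bounds, evaluate a fitness function, and keep the best solution. Since MEALPY's exact API differs across versions, here is a library-free sketch of that common loop using plain random search on the sphere benchmark (all names here are illustrative, not MEALPY's interface):

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def random_search(fitness, dim, lb, ub, iters=2000, seed=0):
    """Minimal baseline optimizer: keep the best of uniformly
    sampled candidates within the box [lb, ub]^dim."""
    rng = random.Random(seed)
    best = [rng.uniform(lb, ub) for _ in range(dim)]
    best_fit = fitness(best)
    for _ in range(iters):
        cand = [rng.uniform(lb, ub) for _ in range(dim)]
        fit = fitness(cand)
        if fit < best_fit:
            best, best_fit = cand, fit
    return best, best_fit

best, best_fit = random_search(sphere, dim=3, lb=-5.0, ub=5.0)
```

A real metaheuristic replaces the uniform sampling with a population-based update rule, but the problem definition (objective, bounds, dimension) and the solve-then-return-best contract are the same, which is what makes a shared library useful for fair comparisons.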
“…The stability of different models was comparatively analyzed in an all-round way using box plots showing the results of 17 models trained on the test set 10 times (Figure 8). The results show that, in terms of MAE, the SAEO-BP neural network has the highest stability, followed by the IAEO-BP and VCS-BP neural networks; in terms of the other three performance indicators, the SAEO-BP has high stability, second only to the IAEO-BP neural network.…”
Section: Comparative Analysis of the Performance of the SAEO-BP Neura...
confidence: 99%
“…Additionally, [6] selected 20 benchmark functions from CEC2014 and CEC2015 to assess the Improved Sea Lion Optimization algorithm against other techniques. Moreover, [7] proposed a specialized library for metaheuristic algorithms and utilized functions from CEC2017 to evaluate the performance of these algorithms. Notably, recent metaheuristic algorithms, such as the Q-learning-based Vegetation Evolution algorithm [8], have incorporated Opfunu library functions for testing on the CEC2020 function set, as well as engineering problems and WSN coverage optimization problems.…”
Section: Introduction
confidence: 99%