2022
DOI: 10.3390/a15040122

Neuroevolution for Parameter Adaptation in Differential Evolution

Abstract: Parameter adaptation is one of the key research fields in the area of evolutionary computation. In this study, the application of neuroevolution of augmented topologies to design efficient parameter adaptation techniques for differential evolution is considered. The artificial neural networks in this study are used for setting the scaling factor and crossover rate values based on the available information about the algorithm performance and previous successful values. The training is performed on a set of benchmark…
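The abstract describes networks that map information about algorithm performance and previously successful values to new scaling factor (F) and crossover rate (Cr) settings. The sketch below is not the authors' NEAT configuration; it is a minimal illustration of that mapping using a fixed, randomly initialized feed-forward network, and the feature names, layer sizes, and weights are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical state features: fraction of budget used, recent success rate,
# and mean successful F / Cr. In the paper NEAT evolves the network topology
# and weights; a fixed, randomly initialized 4-8-2 MLP stands in here.
W1 = rng.normal(scale=0.5, size=(8, 4))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(2, 8))
b2 = np.zeros(2)

def adapt_parameters(budget_used, success_rate, mean_f, mean_cr):
    """Map algorithm-state features to new (F, Cr) values (illustrative only)."""
    x = np.array([budget_used, success_rate, mean_f, mean_cr])
    h = np.tanh(W1 @ x + b1)
    out = 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))   # squash outputs to (0, 1)
    f, cr = out
    return float(f), float(cr)

# Example call mid-run with a 20% recent success rate.
print(adapt_parameters(budget_used=0.5, success_rate=0.2, mean_f=0.6, mean_cr=0.8))
```

In the actual method the network structure itself is evolved, so the mapping need not be a fixed two-layer net; this sketch only shows where such a network sits in the adaptation loop.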

Cited by 8 publications (3 citation statements)
References 68 publications
“…In particular, these studies have shown that GP is capable of finding high-performing adaptation techniques, which differ from the widely used success-history adaptation [3]. Instead of GP, a neuroevolutionary approach can be used, as shown in [16]. Although GP and neuroevolution are useful methods in this case, alternatives can be considered; for example, in [8] a combination of Taylor series and the EGO algorithm was used to find parameter adaptation schemes for the scaling factor F, crossover rate Cr and population size N in the NL-SHADE-RSP algorithm.…”
Section: Hyper-heuristics for Automatic Design of Algorithms (mentioning)
confidence: 99%
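The success-history adaptation [3] that these automatically designed techniques are compared against keeps small memories of recently successful F and Cr values and samples new parameters around a randomly chosen memory cell. Below is a minimal sketch of that mechanism; the memory size and exact constants follow the common SHADE description but are chosen here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

H = 5                      # memory size (illustrative)
M_F = np.full(H, 0.5)      # success-history memory for F
M_CR = np.full(H, 0.5)     # success-history memory for Cr
k = 0                      # next memory cell to overwrite

def sample_parameters():
    """Draw (F, Cr) around a randomly chosen memory cell."""
    r = rng.integers(H)
    f = 0.0
    while f <= 0.0:        # re-draw non-positive F, as in SHADE
        f = M_F[r] + 0.1 * np.tan(np.pi * (rng.random() - 0.5))   # Cauchy sample
    f = min(f, 1.0)
    cr = float(np.clip(rng.normal(M_CR[r], 0.1), 0.0, 1.0))
    return f, cr

def update_memory(success_f, success_cr, improvements):
    """After a generation, store improvement-weighted means of successful values
    (lists of successful F, Cr and their fitness improvements)."""
    global k
    if not success_f:
        return
    w = np.asarray(improvements, float) / np.sum(improvements)
    sf, scr = np.asarray(success_f, float), np.asarray(success_cr, float)
    M_F[k] = np.sum(w * sf**2) / np.sum(w * sf)   # weighted Lehmer mean
    M_CR[k] = np.sum(w * scr)                     # weighted arithmetic mean
    k = (k + 1) % H
```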
“…The GP-designed equations depended on the current resource, success rate and values from success-history adaptation, and different search ranges were considered for the scaling factor (F), including negative values. In [32], a similar approach was utilized; instead of genetic programming, the neuroevolution of augmented topologies (NEAT) algorithm was used. Such approaches can be classified as automated design of algorithms (ADA) or genetic improvement (GI) [33].…”
Section: Related Work (mentioning)
confidence: 99%
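For a concrete sense of what an equation-based rule over these inputs (remaining resource, success rate, success-history memory value) can look like, here is a hand-written expression. It is not one of the GP- or NEAT-designed rules from the cited papers; the coefficients and the extended lower bound for F are hypothetical and chosen only to illustrate the form of such a rule.

```python
import math

def gp_style_f(resource_left, success_rate, mem_f):
    """Hypothetical equation-based rule for the scaling factor F.

    resource_left: fraction of the evaluation budget still available, in [0, 1]
    success_rate:  fraction of successful trial vectors in the last generation
    mem_f:         current success-history memory value for F
    """
    raw = mem_f + 0.5 * (success_rate - 0.1) * math.sqrt(resource_left)
    return max(-0.2, min(1.0, raw))   # range deliberately extends below zero

print(gp_style_f(resource_left=0.7, success_rate=0.3, mem_f=0.5))
```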
“…To avoid manual tuning of parameters, researchers have suggested adaptive/self-adaptive parameter settings, where the control parameters are changed dynamically based on feedback from the search instead of taking fixed values. A few works on the development of adaptive/self-adaptive methods for setting control parameter values are presented in [4-12].…”
Section: Introduction (mentioning)
confidence: 99%
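One common variant of such self-adaptation (in the spirit of jDE-like schemes; the survey's references [4-12] are not reproduced here) attaches F and Cr to each individual and re-samples them with small probabilities before a trial vector is generated, so that settings producing surviving offspring are inherited. A minimal sketch under those assumptions:

```python
import random

TAU1, TAU2 = 0.1, 0.1          # re-sampling probabilities for F and Cr
F_LOW, F_RANGE = 0.1, 0.9      # re-sampled F is drawn from [0.1, 1.0]

def self_adapt(f_i, cr_i):
    """Per-individual update of (F, Cr) before creating a trial vector."""
    if random.random() < TAU1:
        f_i = F_LOW + random.random() * F_RANGE
    if random.random() < TAU2:
        cr_i = random.random()
    return f_i, cr_i

# The updated values are used to build the trial vector and are kept only if
# that trial replaces its parent, so effective settings propagate.
print(self_adapt(0.5, 0.9))
```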