2022
DOI: 10.1109/tevc.2021.3117116
Static and Dynamic Multimodal Optimization by Improved Covariance Matrix Self-Adaptation Evolution Strategy With Repelling Subpopulations

Abstract: The covariance matrix self-adaptation evolution strategy with repelling subpopulations (RS-CMSA-ES) is one of the most successful multimodal optimization methods currently available. However, some of its components may become inefficient in certain situations. This study introduces the second variant of this method, called RS-CMSA-ESII. It improves the adaptation schemes for the normalized taboo distances of the archived solutions and the covariance matrix of the subpopulation, the termination criteria for the…

Cited by 13 publications (6 citation statements)
References 49 publications
“…(8), which belong to the source task j and the target task i respectively, can reduce the difference in weights of the rank prediction processes (i.e., H_s(x_{gb,j}) and H_t(x_{mgb,j})) of the two solutions in Eq. (9). In this way, our proposed optimization model in CTM can be regarded as minimizing the gap between the learned H_s(x) and H_t(x).…”
Section: Explanation for the CTM Mechanism (mentioning)
confidence: 99%
“…The evolutionary process executes repeatedly until a stopping criterion is met. EC algorithms have been successfully applied to complex optimization problems such as large-scale [3]-[5], dynamic [6], [7], multimodal [8], [9], multi-/many-objective [10]-[12], and expensive [13]-[15] problems, as well as real-world applications [16]-[20].…”
Section: Introduction (mentioning)
confidence: 99%
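The generational loop this statement refers to (produce offspring, select survivors, repeat until a stopping criterion fires) can be sketched as a minimal (mu, lambda) evolution strategy. This is an illustrative sketch only; the function names and parameter values are assumptions and are not taken from RS-CMSA-ESII or any cited paper:

```python
import random

def evolve(fitness, dim=2, pop_size=20, n_offspring=40, sigma=0.3,
           max_gens=100, target=1e-8, seed=0):
    """Minimal (mu, lambda) evolution strategy: mutate random parents with
    Gaussian noise, keep the best offspring, stop when a criterion is met."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(max_gens):
        # Variation: each offspring is a Gaussian perturbation of a parent.
        kids = []
        for _ in range(n_offspring):
            parent = rng.choice(pop)
            kids.append([x + rng.gauss(0, sigma) for x in parent])
        # Selection: the best pop_size offspring form the next generation.
        kids.sort(key=fitness)
        pop = kids[:pop_size]
        if fitness(pop[0]) < fitness(best):
            best = pop[0]
        # Stopping criterion: target fitness reached.
        if fitness(best) < target:
            break
    return best

# Example on the 2-D sphere function (minimum 0 at the origin).
sphere = lambda x: sum(v * v for v in x)
best = evolve(sphere)
```

Real ES variants additionally adapt sigma (and, in CMA/CMSA methods, a full covariance matrix) during the run; the fixed step size here is kept only for brevity.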
“…Most existing optimization algorithms for UMOPs or MMOPs tend to guide particles to certain locations according to the global and local optimal locations (Figure 3(left)). Such a feature helps algorithms for UMOPs or MMOPs let particles converge to one or several locations [25,26]. However, for RMOPs such as the sampling process of the LSF described above, the continuity of the solutions will be ignored under such a search pattern, leading to a poor approximation of the LSF.…”
Section: Normal Search Pattern (mentioning)
confidence: 99%
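The search pattern described here, particles pulled toward the global best and their own personal best, is the canonical PSO velocity update. The sketch below uses assumed parameter values (w, c1, c2) and is a generic PSO step, not the NSPSO variant from the cited paper:

```python
import random

def pso_step(positions, velocities, pbest, gbest,
             w=0.7, c1=1.5, c2=1.5, rng=random):
    """One canonical PSO update: inertia plus a cognitive pull toward each
    particle's personal best and a social pull toward the global best."""
    for i, (x, v) in enumerate(zip(positions, velocities)):
        for d in range(len(x)):
            r1, r2 = rng.random(), rng.random()
            v[d] = (w * v[d]
                    + c1 * r1 * (pbest[i][d] - x[d])   # cognitive pull
                    + c2 * r2 * (gbest[d] - x[d]))     # social pull
            x[d] += v[d]
    return positions, velocities

# Small driver on the 2-D sphere function to show the convergent behavior
# the citation statement describes: particles collapse onto one location.
rng = random.Random(1)
sphere = lambda x: sum(v * v for v in x)
pos = [[rng.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
vel = [[0.0, 0.0] for _ in range(10)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)[:]
for _ in range(50):
    pso_step(pos, vel, pbest, gbest, rng=rng)
    for i, p in enumerate(pos):
        if sphere(p) < sphere(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=sphere)[:]
```

This convergence to a single location is exactly the behavior the quoted authors identify as harmful for approximating a continuous LSF, which motivates their modified search pattern.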
“…The proposed NSPSO is designed to deal with RMOPs such as the finding of the LSF, which can also be seen as a special multimodal optimization problem. Therefore, four algorithms, NMMSO [31], RS-CMSA-ESII [25], Multi_AMO [32], and DP-MSCC-ES [33], are chosen as competitive algorithms for the comparative study. Notably, RS-CMSA-ESII won the championship in the GECCO 2020 Competition on Niching Methods for Multimodal Optimization.…”
Section: Comparative Analysis Settings (mentioning)
confidence: 99%
“…Currently, the ANN-based learning branch has attracted great attention due to the success of deep learning (DL) in various real-world applications [5]-[7]. More significantly, EC algorithms have also made great strides in research and applications [8]-[13]. EC algorithms originated in the 1960s, when computer scientists designed algorithms such as the genetic algorithm (GA) [14], [15], evolution strategy [16], and evolutionary programming [17]-[19] for solving optimization problems.…”
Section: Introduction (mentioning)
confidence: 99%