2020
DOI: 10.1016/j.swevo.2020.100696
Selective-candidate framework with similarity selection rule for evolutionary optimization

Abstract: Achieving better exploitation and exploration capabilities (EEC) has always been an important yet challenging issue in evolutionary optimization algorithm (EOA) design. The difficulties lie in obtaining a good balance in EEC, which is cooperatively determined by operations and parameters in an EOA. When deficiencies in exploitation or exploration are observed, most existing works only consider supplementing it, either by designing new operations or by altering the parameters. Unfortunately, when different sit…

Cited by 20 publications (11 citation statements)
References 56 publications
“…As reported in many published applications of DL in genomic predictions ( Bellot et al 2018 ; Abdollahi-Arpanahi et al 2020 ; Zingaretti et al 2020 ), we observed that retraining of a certain DL model with the same hyperparameter configuration and the same dataset produced slightly different predictions. This forced us to consider the variation in the predictive performance under the retraining in DE and post-DE model selection (see the methods section).…”
Section: Methods (supporting, confidence: 63%)
“…For the real dataset, optimized MLPs fixed the nonlinear activation function “sigmoid.” We argue that the selected nonlinear activation reflects the increased complexity of polygenic inheritance in real datasets. Regarding this perspective, Zingaretti et al. (2020) indicated that in a real dataset, DL could model complex relationships by employing nonlinear functions, and they also observed that sigmoid-like hyperbolic tangent (“tanh”) was a safer choice overall.…”
Section: Methods (mentioning, confidence: 99%)
“…Kim and Lee (2019) reported that deep learning models with different hyperparameters could have the same predictive performance, which indicated that the best solution may not be unique. Zhang et al (2020) also indicated that superior solutions would prefer the closest candidates in evolutionary optimization algorithms. Therefore, we argue that DE evolves a population to where candidate solutions are increasingly similar to each other.…”
Section: Optimization Runtime Profiles (mentioning, confidence: 99%)
“…So, it is a crucial problem for this approach, how to design an effective individual selection mechanism to select the appropriate individual from the candidate pool. Aiming at the multi-strategy method and its important individual selection mechanism, we pay attention to an advanced DE optimization framework, namely, selective-candidate framework with similarity selection rule (SCSS), which was proposed by Zhang et al [28] in 2020. It consists of multiple candidates generation (MCG) and similarity selection (SS) rule.…”
Section: Nomenclature Descriptions (mentioning, confidence: 99%)
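The quoted descriptions of SCSS, multiple candidates generation (MCG) followed by a similarity selection (SS) rule, together with the earlier remark that superior solutions prefer the closest candidates, can be illustrated with a rough DE-based sketch. This is not the authors' implementation: the DE/rand/1/bin base operator, the function names (de_rand_1_bin, scss_generation), the n_candidates setting, the 50/50 rank split between "superior" and "inferior" parents, and the rule that inferior parents take the farthest candidate are all illustrative assumptions; the cited text only confirms that several candidates are generated per parent and that superior parents select the most similar one.

```python
import numpy as np

def de_rand_1_bin(pop, i, F=0.5, CR=0.9, rng=np.random.default_rng()):
    """Plain DE/rand/1/bin trial-vector generation (assumed base operator)."""
    n, d = pop.shape
    others = [j for j in range(n) if j != i]
    r1, r2, r3 = rng.choice(others, size=3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])
    cross = rng.random(d) < CR
    cross[rng.integers(d)] = True              # guarantee at least one mutant gene
    return np.where(cross, mutant, pop[i])

def scss_generation(pop, f, n_candidates=2):
    """One generation sketch: MCG makes several trials per parent, the SS rule
    picks one by distance to the parent, then the usual DE greedy replacement."""
    fitness = np.array([f(x) for x in pop])
    ranks = np.argsort(np.argsort(fitness))    # 0 = best (minimization)
    new_pop = pop.copy()
    for i in range(len(pop)):
        # Multiple candidates generation (MCG)
        cands = np.array([de_rand_1_bin(pop, i) for _ in range(n_candidates)])
        dists = np.linalg.norm(cands - pop[i], axis=1)
        # Similarity selection (SS): superior parents take the closest candidate;
        # the farthest-for-inferior branch is an assumption, not from the quote.
        idx = np.argmin(dists) if ranks[i] < len(pop) // 2 else np.argmax(dists)
        trial = cands[idx]
        if f(trial) <= fitness[i]:             # standard DE greedy replacement
            new_pop[i] = trial
    return new_pop

# Toy usage on the sphere function
pop = np.random.default_rng(0).uniform(-5, 5, size=(30, 10))
for _ in range(200):
    pop = scss_generation(pop, lambda x: float(np.sum(x ** 2)))
```

Under this reading, the selection rule simply biases well-ranked parents toward exploitation (accept a nearby trial) and poorly ranked parents toward exploration, which matches the citing paper's observation that DE populations grow increasingly similar as superior solutions keep choosing their closest candidates.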