One of the most challenging problems in evolutionary computation is to select from its family of diverse solvers one that performs well on a given problem. This algorithm selection problem is complicated by the fact that different phases of the optimization process require different search behavior. While this can partly be controlled by the algorithm itself, there exist large differences in performance between algorithms. It can therefore be beneficial to swap the configuration or even the entire algorithm during the run. Long deemed impractical, recent advances in Machine Learning and in exploratory landscape analysis give hope that this dynamic algorithm configuration (dynAC) problem can eventually be solved by automatically trained configuration schedules. With this work we aim at promoting research on dynAC by introducing a simpler variant that focuses only on switching between different algorithms, not configurations. Using the rich data from the Black Box Optimization Benchmark (BBOB) platform, we show that even single-switch dynamic algorithm selection (dynAS) can potentially result in significant performance gains. We also discuss key challenges in dynAS, and argue that the BBOB framework can become a useful tool in overcoming these.
CCS CONCEPTS
• Theory of computation → Bio-inspired optimization; Online algorithms.