In the field of evolutionary computation, one of the most challenging topics is algorithm selection. Knowing which heuristics to use for which optimization problem is key to obtaining high-quality solutions. We aim to extend this research topic by taking a first step towards a selection method for adaptive CMA-ES algorithms. We build upon the theoretical work of van Rijn et al. [PPSN'18], in which the potential of switching between different CMA-ES variants was quantified in the context of a modular CMA-ES framework. We demonstrate in this work that their proposed approach is not very reliable, in that implementing the suggested adaptive configurations does not yield the predicted performance gains. We propose a revised approach, which results in a more robust fit between predicted and actual performance. The adaptive CMA-ES approach obtains performance gains on 18 out of 24 tested functions of the BBOB benchmark, with stable advantages of up to 23%. An analysis of module activation indicates which modules are most crucial for the different phases of optimizing each of the 24 benchmark problems. The module activation also suggests that additional gains are possible when including the (B)IPOP modules, which we have excluded from the present work.

The adaptive CMA-ES configurations we use were implemented in the modular CMA-ES framework introduced in [vRWvLB16], which is freely available at [vR18]. This framework implements 11 different modules. Of these 11 modules, 9 are binary and 2 are ternary, allowing for a combined total of 4,608 different configurations. The full list of available modules is shown in Table 1.
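As a quick check on that count: with 9 binary and 2 ternary modules, the configuration space has 2^9 · 3^2 = 512 · 9 = 4,608 elements. A minimal Python sketch enumerating an abstract version of this space (the module names are generic placeholders, not the framework's exact identifiers):

```python
from itertools import product

# 9 binary (on/off) modules and 2 ternary (three-option) modules,
# as described above. Names are placeholders, not the framework's.
binary = {f"binary_module_{i}": (0, 1) for i in range(9)}
ternary = {f"ternary_module_{i}": (0, 1, 2) for i in range(2)}

space = {**binary, **ternary}
configurations = list(product(*space.values()))
print(len(configurations))  # 2**9 * 3**2 = 4608
```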
One of the most challenging problems in evolutionary computation is to select from its family of diverse solvers one that performs well on a given problem. This algorithm selection problem is complicated by the fact that different phases of the optimization process require different search behavior. While this can partly be controlled by the algorithm itself, there exist large differences in performance between algorithms. It can therefore be beneficial to swap the configuration or even the entire algorithm during the run. While long deemed impractical, recent advances in Machine Learning and in exploratory landscape analysis give hope that this dynamic algorithm configuration (dynAC) can eventually be solved by automatically trained configuration schedules. With this work we aim at promoting research on dynAC by introducing a simpler variant that focuses only on switching between different algorithms, not configurations. Using the rich data from the Black Box Optimization Benchmark (BBOB) platform, we show that even single-switch dynamic algorithm selection (dynAS) can potentially result in significant performance gains. We also discuss key challenges in dynAS, and argue that the BBOB framework can become a useful tool in overcoming these. CCS CONCEPTS • Theory of computation → Bio-inspired optimization; Online algorithms.
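To make the single-switch idea concrete, the sketch below splits an evaluation budget at a switch point tau and warm-starts a second algorithm from the first one's best-so-far solution. Everything here (the two toy optimizers, the sphere objective, and the value of tau) is an illustrative stand-in for the paper's BBOB setting, not its actual experimental setup:

```python
import numpy as np

def sphere(x):
    """Toy objective; BBOB functions would be used in the paper's setting."""
    return float(np.sum(x ** 2))

def random_search(f, x0, budget, rng):
    """Stage-1 exploration: uniform sampling around the incumbent."""
    best_x, best_f = x0, f(x0)
    for _ in range(budget):
        cand = best_x + rng.uniform(-1, 1, size=x0.shape)
        fc = f(cand)
        if fc < best_f:
            best_x, best_f = cand, fc
    return best_x, best_f

def one_plus_one_es(f, x0, budget, rng, sigma=0.5):
    """Stage-2 exploitation: (1+1)-ES with a 1/5th-success-style rule."""
    best_x, best_f = x0, f(x0)
    for _ in range(budget):
        cand = best_x + sigma * rng.standard_normal(x0.shape)
        fc = f(cand)
        if fc < best_f:
            best_x, best_f, sigma = cand, fc, sigma * 1.5
        else:
            sigma *= 0.9  # 1.5 * 0.9**4 ~= 1 balances at a ~1/5 success rate
    return best_x, best_f

# Single-switch schedule: spend a fraction tau of the budget on algorithm A,
# then warm-start algorithm B from A's best-so-far solution.
rng = np.random.default_rng(0)
budget, tau = 2000, 0.3
x0 = rng.uniform(-5, 5, size=10)
x_a, f_a = random_search(sphere, x0, int(tau * budget), rng)
x_b, f_b = one_plus_one_es(sphere, x_a, budget - int(tau * budget), rng)
print(f"after A: {f_a:.3e}, after A->B switch: {f_b:.3e}")
```

Beyond the best-so-far point, a real switch may also need to transfer internal algorithm state (step sizes, covariance information), which is one of the key challenges such an approach faces.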
Variational quantum algorithms (VQAs) offer a promising path toward using near-term quantum hardware for applications in academic and industrial research. These algorithms aim to find approximate solutions to quantum problems by optimizing a parametrized quantum circuit using a classical optimization algorithm. A successful VQA requires fast and reliable classical optimization algorithms. Understanding and optimizing how off-the-shelf optimization methods perform in this context is important for the future of the field. In this work, we study the performance of four commonly used gradient-free optimization methods [sequential least-squares quadratic programming, constrained optimization by linear approximations, the covariance matrix adaptation evolution strategy (CMA-ES), and the simultaneous perturbation stochastic approximation (SPSA)] in finding ground-state energies of a range of small chemistry and material science problems. We test a telescoping sampling scheme (where the accuracy of the cost-function estimate provided to the optimizer is increased as the optimization converges) for all methods, demonstrating mixed results across our chosen range of optimizers and problems. We further hyperparameter-tune two of the four optimizers (CMA-ES and SPSA) across a large range of models and demonstrate that, with appropriate hyperparameter tuning, CMA-ES is competitive with and sometimes outperforms SPSA (which is not observed in the absence of hyperparameter tuning). Finally, we investigate the ability of an optimizer to beat the "sampling-noise floor" given by the sampling noise of each cost-function estimate provided to the optimizer. Our results demonstrate the necessity of tailoring and hyperparameter-tuning known optimization techniques for inherently noisy variational quantum algorithms, and show that the variational landscape one finds in a VQA is highly problem- and system-dependent. This provides guidance for future implementations of these algorithms in experiments.
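The telescoping idea can be sketched in a few lines: early iterations use few measurement shots (cheap but noisy cost estimates), while later iterations use more (accurate estimates near convergence). Below is a minimal, hypothetical illustration using SPSA on a toy landscape; the shot schedule, gain sequences, and objective are assumptions for demonstration, not the settings studied in this work:

```python
import numpy as np

def noisy_energy(theta, shots, rng):
    """Stand-in for a VQA cost estimate: the true energy plus sampling
    noise whose standard deviation shrinks as 1/sqrt(shots)."""
    true_energy = float(np.sum(np.sin(theta) ** 2))  # toy landscape
    return true_energy + rng.standard_normal() / np.sqrt(shots)

def spsa_step(theta, k, shots, rng, a=0.2, c=0.2):
    """One SPSA iteration: two cost evaluations give a gradient estimate."""
    ak, ck = a / (k + 1) ** 0.602, c / (k + 1) ** 0.101
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    g = (noisy_energy(theta + ck * delta, shots, rng)
         - noisy_energy(theta - ck * delta, shots, rng)) / (2 * ck * delta)
    return theta - ak * g

# Telescoping schedule: few shots early, more shots near convergence.
rng = np.random.default_rng(1)
theta = rng.uniform(-np.pi, np.pi, size=6)
for k in range(300):
    shots = int(100 * 2 ** (k // 100))  # 100 -> 200 -> 400 shots
    theta = spsa_step(theta, k, shots, rng)
print("final (noise-free) energy:", float(np.sum(np.sin(theta) ** 2)))
```

In this picture, the "sampling-noise floor" mentioned above corresponds to the 1/sqrt(shots) term: no optimizer can reliably resolve energy differences much smaller than the noise of the cost estimates it receives.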
Introducing new algorithmic ideas is a key part of the continuous improvement of existing optimization algorithms. However, when introducing a new component into an existing algorithm, assessing its potential benefits is a challenging task. Often, the component is added to a default implementation of the underlying algorithm and compared against a limited set of other variants. This assessment ignores any potential interplay with other algorithmic ideas that share the same base algorithm, which is critical in understanding the exact contributions being made. We explore a more extensive procedure, which uses hyperparameter tuning as a means of assessing the benefits of new algorithmic components. This allows for a more robust analysis, not only by focusing on the impact on performance, but also by investigating how this performance is achieved. We implement our suggestion in the context of the Modular CMA-ES framework, which was redesigned and extended to include some new modules and several new options for existing modules, mostly focused on the step-size adaptation method. Our analysis highlights the differences between these new modules, and identifies the situations in which they have the largest contribution. CCS CONCEPTS • Theory of computation → Design and analysis of algorithms; Bio-inspired optimization.
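The assessment procedure can be paraphrased as: tune the hyperparameters of each variant, with and without the new component, and compare tuned performance rather than default performance. A minimal sketch of this idea, with a toy (1+1)-ES standing in for the modular framework and random search standing in for the tuner; all names and numbers are illustrative:

```python
import numpy as np

def run_algorithm(module_on, sigma0, decay, rng, dim=10, budget=500):
    """Toy stand-in for a modular algorithm: a (1+1)-ES whose 'new module'
    toggles an extra perturbation applied every 100 iterations."""
    x = rng.uniform(-5, 5, size=dim)
    fx, sigma = float(np.sum(x ** 2)), sigma0
    for t in range(budget):
        step = sigma * rng.standard_normal(dim)
        if module_on and t % 100 == 0:
            step += rng.uniform(-1, 1, size=dim)  # the component under study
        cand = x + step
        fc = float(np.sum(cand ** 2))
        if fc < fx:
            x, fx = cand, fc
        sigma *= decay
    return fx

def tuned_performance(module_on, n_trials, rng):
    """Random-search tuning: report the best hyperparameter setting's
    mean performance, instead of the default setting's performance."""
    best = np.inf
    for _ in range(n_trials):
        sigma0, decay = rng.uniform(0.1, 2.0), rng.uniform(0.95, 1.0)
        score = np.mean([run_algorithm(module_on, sigma0, decay, rng)
                         for _ in range(5)])
        best = min(best, score)
    return best

rng = np.random.default_rng(2)
print("module off, tuned:", tuned_performance(False, 30, rng))
print("module on,  tuned:", tuned_performance(True, 30, rng))
```

The point of giving both variants the same tuning budget is that any remaining gap is attributable to the component itself, rather than to one variant happening to have more favorable defaults.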
When faced with a specific optimization problem, deciding which algorithm to apply is always a difficult task. Not only is there a vast variety of algorithms to select from, but these algorithms are often controlled by many hyperparameters, which need to be suitably tuned in order to achieve peak performance. Usually, the problem of selecting and configuring the optimization algorithm is addressed sequentially, by first selecting a suitable algorithm and then tuning it for the application at hand. Integrated approaches, commonly known as Combined Algorithm Selection and Hyperparameter optimization (CASH) solvers, have shown promise in several applications. In this work we compare sequential and integrated approaches for selecting and tuning the best out of the 4,608 variants of the modular Covariance Matrix Adaptation Evolution Strategy (CMA-ES). We show that the ranking of these variants depends to a large extent on the quality of the hyperparameters. Sequential approaches are therefore likely to recommend sub-optimal choices. Integrated approaches, in contrast, manage to provide competitive results at a much smaller computational cost. We also highlight important differences in the search behavior of two CASH approaches, which build on racing (irace) and on model-based optimization (MIP-EGO), respectively. CCS CONCEPTS • Theory of computation → Evolutionary algorithms; Bio-inspired optimization; Algorithm design techniques.
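To illustrate what an integrated (CASH) search space looks like, the sketch below treats the algorithm choice as a top-level categorical variable, with hyperparameters conditional on that choice, and explores the joint space by plain random sampling. The algorithms, hyperparameters, and scoring function are hypothetical placeholders; irace and MIP-EGO would replace the sampling loop with racing and model-based proposals, respectively:

```python
import numpy as np

def evaluate(algorithm, params, rng):
    """Stand-in for running a configured optimizer on a benchmark and
    returning a performance score (lower is better)."""
    if algorithm == "es":
        # pretend performance depends on a step-size hyperparameter
        return (params["sigma"] - 0.7) ** 2 + 0.05 * rng.standard_normal()
    else:  # "de"
        return (params["F"] - 0.5) ** 2 + 0.1 + 0.05 * rng.standard_normal()

def sample_configuration(rng):
    """CASH search space: the algorithm choice is a top-level categorical
    variable, and hyperparameters are conditional on that choice."""
    if rng.random() < 0.5:
        return "es", {"sigma": rng.uniform(0.01, 2.0)}
    return "de", {"F": rng.uniform(0.1, 1.0)}

# Integrated search by random sampling over the joint space.
rng = np.random.default_rng(3)
best_score, best_conf = np.inf, None
for _ in range(200):
    algo, params = sample_configuration(rng)
    score = evaluate(algo, params, rng)
    if score < best_score:
        best_score, best_conf = score, (algo, params)
print(best_conf, best_score)
```

A sequential approach would instead fix the algorithm first (say, at its default hyperparameters) and only then tune the chosen one, which is exactly where mis-ranking can creep in.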