For genetic algorithms using a bit-string representation of length n, the general recommendation is to use a mutation rate of 1/n. In this work, we discuss whether this is really justified for multimodal functions. Taking jump functions and the (1 + 1) evolutionary algorithm as the simplest example, we observe that larger mutation rates give significantly better runtimes. For the Jump_{m,n} function, any mutation rate between 2/n and m/n leads to a speed-up at least exponential in m compared to the standard choice. The asymptotically best runtime, obtained from the mutation rate m/n and leading to a speed-up super-exponential in m, is very sensitive to small changes of the mutation rate: any deviation by a small (1 ± ε) factor leads to a slow-down exponential in m. Consequently, any fixed mutation rate gives strongly sub-optimal results for most jump functions. Building on this observation, we propose to use a random mutation rate α/n, where α is chosen from a power-law distribution. We prove that the (1 + 1) EA with this heavy-tailed mutation rate optimizes any Jump_{m,n} function in a time that is only a small polynomial (in m) factor above the one stemming from the optimal rate for this m. Our heavy-tailed mutation operator yields similar speed-ups (over the best known performance guarantees) for the vertex cover problem in bipartite graphs and the matching problem in general graphs. Following the example of fast simulated annealing, fast evolution strategies, and fast evolutionary programming, we propose to call genetic algorithms using a heavy-tailed mutation operator fast genetic algorithms.
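The heavy-tailed mutation operator described above can be sketched as follows: sample α from a discrete power-law distribution on {1, ..., n/2} and flip each bit independently with probability α/n. This is a minimal illustrative sketch, not the authors' reference implementation; the function names are ours, and the exponent beta = 1.5 is an assumption (a power-law exponent strictly between 1 and 2 is the regime the abstract's analysis concerns).

```python
import random

def sample_power_law_rate(n, beta=1.5):
    # Discrete power law on {1, ..., n//2}: P[alpha = i] proportional to i^(-beta).
    # The returned mutation rate is alpha/n, so rates up to 1/2 are possible.
    upper = max(1, n // 2)
    weights = [i ** (-beta) for i in range(1, upper + 1)]
    alpha = random.choices(range(1, upper + 1), weights=weights)[0]
    return alpha / n

def heavy_tailed_mutate(bits, beta=1.5):
    # Standard bit-wise mutation, but with a freshly sampled heavy-tailed rate
    # each time the operator is applied.
    n = len(bits)
    p = sample_power_law_rate(n, beta)
    return [b ^ (random.random() < p) for b in bits]
```

Because α is re-sampled at every application, the operator frequently behaves like the classic 1/n mutation (small α has the largest probability mass) while still producing the larger jumps of size ~m/n often enough to escape the local optimum of a jump function.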
The (1 + (λ, λ)) genetic algorithm (GA) proposed in [Doerr, Doerr, and Ebel. From black-box complexity to designing new genetic algorithms. Theoretical Computer Science (2015)] is one of the few examples for which a super-constant speed-up of the expected optimization time through the use of crossover could be rigorously demonstrated. It was proven that the expected optimization time of this algorithm on OneMax is O(max{n log(n)/λ, λn}) for any offspring population size λ ∈ {1, ..., n} (with the other parameters suitably dependent on λ), and it was shown experimentally that a self-adjusting choice of λ leads to a better, most likely linear, runtime. In this work, we study more precisely how the optimization time depends on the parameter choices, leading to the following results on how to optimally choose the population size, the mutation probability, and the crossover bias, both in a static and a dynamic fashion. For the mutation probability and the crossover bias depending on λ as in [DDE15], we improve the previous runtime bound to O(max{n log(n)/λ, nλ log log(λ)/log(λ)}). This expression is minimized by a value of λ slightly larger than what the previous result suggested and gives an expected optimization time of O(n log(n) log log log(n)/log log(n)). We show that no static choice in the three-dimensional parameter space of offspring population size, mutation probability, and crossover bias gives an asymptotically better runtime. Results presented in this work are based on [12][13][14].
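The structure of the (1 + (λ, λ)) GA can be sketched as below: a mutation phase that draws a step size ℓ ~ Bin(n, p) once and creates λ offspring flipping ℓ random bits each, followed by a crossover phase that creates λ biased crossovers of the parent with the best mutant. This is a hedged sketch, not the paper's reference code; we use the standard parameterization p = λ/n and crossover bias c = 1/λ from [DDE15], and the function names are ours.

```python
import random

def onemax(x):
    # Fitness: number of one-bits.
    return sum(x)

def one_plus_lambda_lambda_ga(n, lam, max_gens=10000, seed=0):
    rng = random.Random(seed)
    p, c = lam / n, 1 / lam  # mutation probability and crossover bias as in [DDE15]
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(max_gens):
        if onemax(x) == n:
            break
        # Mutation phase: one step size ell for the whole generation,
        # then lam offspring each flipping ell distinct random positions.
        ell = sum(rng.random() < p for _ in range(n))
        mutants = []
        for _ in range(lam):
            y = x[:]
            for i in rng.sample(range(n), ell):
                y[i] ^= 1
            mutants.append(y)
        best_mutant = max(mutants, key=onemax)
        # Crossover phase: lam biased uniform crossovers, taking each bit
        # from the best mutant with probability c, else from the parent.
        crossovers = []
        for _ in range(lam):
            z = [yi if rng.random() < c else xi
                 for xi, yi in zip(x, best_mutant)]
            crossovers.append(z)
        winner = max(crossovers, key=onemax)
        # Elitist selection: replace the parent only if not worse.
        if onemax(winner) >= onemax(x):
            x = winner
    return x
```

The point of the two-phase design is that the mutation phase can afford a large, risky step size (it only needs one lucky offspring), while the biased crossover then filters the beneficial flipped bits back into the parent.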
B. Doerr, École Polytechnique, LIX (UMR 7161)