2009 IEEE Congress on Evolutionary Computation
DOI: 10.1109/cec.2009.4983116

When is an estimation of distribution algorithm better than an evolutionary algorithm?

Abstract: Despite the wide-spread popularity of estimation of distribution algorithms (EDAs), there has been no theoretical proof that there exist optimisation problems where EDAs perform significantly better than traditional evolutionary algorithms. Here, it is proved rigorously that on a problem called SUBSTRING, a simple EDA called the univariate marginal distribution algorithm (UMDA) is efficient, whereas the (1+1) EA is highly inefficient. Such studies are essential in gaining insight into fundamental research…
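The abstract contrasts the UMDA with the (1+1) EA. As a point of reference, the following is a minimal Python sketch of the standard (1+1) EA on a generic pseudo-Boolean fitness function; since the SUBSTRING function is not defined in this excerpt, the fitness argument, parameter names, and iteration budget are illustrative assumptions rather than details taken from the paper.

import random

def one_plus_one_ea(fitness, n, max_iters=100_000):
    """(1+1) EA sketch: keep one bit string, flip each bit independently
    with probability 1/n, and accept the offspring if it is no worse."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    for _ in range(max_iters):
        y = [bit ^ 1 if random.random() < 1.0 / n else bit for bit in x]
        fy = fitness(y)
        if fy >= fx:  # accept ties, as in the standard (1+1) EA
            x, fx = y, fy
    return x, fx

# Illustrative run on OneMax (number of 1-bits), not on SUBSTRING:
best, best_fitness = one_plus_one_ea(fitness=sum, n=50)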

Cited by 37 publications (38 citation statements) · References 16 publications

Citation statements:
“…Moreover, the result of this paper is an extra example of applying our approach to analyze rigorously the behaviors of EDAs in addition to the three theorems presented in [1]. Moreover, recently we have also provided an answer to the second open question mentioned above: In [3], we prove that the so-called SUBSTRING problem is hard for the (1 + 1) EA while it is easy for the UMDA (without margins).…”
Section: Introduction (mentioning)
confidence: 77%
“…− O(log n / n^(1/4)) ≥ n^(−η), for some η = η(n) = o(1). Combining this with the probability of not exceeding 5/6, the probability of p_j hitting the lower border within T iterations is, in any case, Ω(n^(−η)).…”
Section: Establishing the Lyapunov Condition (mentioning)
confidence: 90%
“…The UMDA is an EDA that samples λ solutions each iteration, selects the µ < λ best solutions, and then sets each marginal p_i to the relative occurrence of 1s among these µ individuals. The algorithm has already been analyzed some years ago for several artificially designed example functions [1,2,3,4]. However, none of these papers considers the most important benchmark function in theory, the OneMax function.…”
Section: Introduction (mentioning)
confidence: 99%
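To make the quoted description of the UMDA concrete, here is a minimal Python sketch of the algorithm without margins (no borders on the marginal probabilities); the population sizes λ = 100 and µ = 50, the iteration budget, and the function names are illustrative assumptions, not values taken from the cited papers.

import random

def umda(fitness, n, lam=100, mu=50, iters=500):
    """UMDA sketch: sample lam bit strings from a product distribution,
    keep the mu best, and set each marginal p[i] to the frequency of
    1s at position i among those mu individuals."""
    p = [0.5] * n  # start from the uniform distribution
    for _ in range(iters):
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=fitness, reverse=True)  # best individuals first
        selected = pop[:mu]
        p = [sum(ind[i] for ind in selected) / mu for i in range(n)]
    return p

# Illustrative run on OneMax (number of 1-bits), the benchmark the statement mentions:
final_marginals = umda(fitness=sum, n=20)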
“…Another simple EDA is the Univariate Marginal Distribution Algorithm (UMDA), which was analysed in a series of papers [1,2,3,4]. UMDA was proposed in [17], assuming, like the cGA, independence between the decision variables.…”
Section: Introduction (mentioning)
confidence: 99%