2013
DOI: 10.1162/evco_a_00094
Benchmarking Parameter-Free AMaLGaM on Functions With and Without Noise

Abstract: We describe a parameter-free estimation-of-distribution algorithm (EDA) called the adapted maximum-likelihood Gaussian model iterated density-estimation evolutionary algorithm (AMaLGaM-IDEA, or AMaLGaM for short) for numerical optimization. AMaLGaM is benchmarked within the 2009 black box optimization benchmarking (BBOB) framework and compared to a variant with incremental model building (iAMaLGaM). We study the implications of factorizing the covariance matrix in the Gaussian distribution, t…
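The estimation-of-distribution loop the abstract describes can be illustrated with a minimal sketch: sample from a Gaussian, select the best fraction, and refit the Gaussian by maximum likelihood. This is a toy one-dimensional reduction under assumed settings (population size, truncation fraction), not the paper's algorithm, which maintains a full (optionally factorized) covariance matrix, anticipated mean shifts, and a distribution multiplier.

```python
# Toy 1-D estimation-of-distribution loop in the spirit of AMaLGaM:
# sample from a Gaussian, keep the best fraction, refit by maximum
# likelihood. Population size and truncation fraction are illustrative.
import random
import statistics

def sphere(x):
    return x * x  # toy objective: minimum at x = 0

def eda_step(mean, stdev, pop_size=50, tau=0.35):
    pop = [random.gauss(mean, stdev) for _ in range(pop_size)]
    pop.sort(key=sphere)                          # rank by fitness
    selected = pop[: max(2, int(tau * pop_size))] # truncation selection
    new_mean = statistics.fmean(selected)         # ML estimate of the mean
    new_stdev = statistics.pstdev(selected)       # ML estimate of the st.dev.
    return new_mean, new_stdev

def run_eda(mean, stdev, generations=40, seed=1):
    random.seed(seed)
    for _ in range(generations):
        mean, stdev = eda_step(mean, stdev)
    return mean, stdev

m, s = run_eda(1.0, 2.0)  # mean drifts toward the optimum at 0, stdev shrinks
```

Note that without a variance-scaling mechanism the pure maximum-likelihood refit shrinks the distribution every generation, which is exactly the premature-convergence risk the distribution multiplier discussed in the citing works is designed to counter.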


Cited by 47 publications (54 citation statements)
References 16 publications
“…The length of the long run is based on the time that is expected to be required to obtain high-quality results. Specifically, the length of the long run is based on a previously published scalability analysis 15 and allows 20 × 93.6 × ℓ^1.81 evaluations, where ℓ is the number of parameters to optimize (7,803,795 evaluations for 25 grid points; 65,522,780 evaluations for 81 grid points). The short run is allowed only 1% of these evaluation budgets, implying a 100-fold speedup.…”
Section: Methods
confidence: 99%
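The quoted budget rule appears damaged by extraction, but the totals are consistent with 20 × 93.6 × ℓ^1.81, with ℓ = 100 for 25 grid points and ℓ = 324 for 81 grid points (four parameters per grid point, inferred from the quoted numbers rather than stated in the text):

```python
# Check the evaluation budgets implied by the snippet's rule
# 20 * 93.6 * ell**1.81, where ell is the number of parameters.
# The ell values below (four per grid point) are an inference from the
# quoted totals, not stated explicitly in the text.
def long_run_budget(ell):
    return 20 * 93.6 * ell ** 1.81

for points, ell in [(25, 100), (81, 324)]:
    long_run = long_run_budget(ell)
    short_run = long_run / 100  # the short run gets 1% of the budget
    print(f"{points} grid points: {long_run:,.0f} long, {short_run:,.0f} short")
```

The computed totals agree with the quoted 7,803,795 and 65,522,780 evaluations to within rounding.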
“…We refer to this niching algorithm as Clustered AMaLGaM, or CAMaLGaM. AMaLGaM was chosen as the core search algorithm partly because of its robust performance [6], but mainly because only a few algorithmic parameters need to be transferred over generations, making a first implementation relatively straightforward and allowing us to focus on the design and impact of using HGML.…”
Section: Clustered AMaLGaM
confidence: 99%
“…A crucial parameter in real-valued EDAs is the distribution multiplier, which prevents premature convergence due to limited diversity in the selection [6,10]. We have to keep track of this and other algorithmic parameters over different generations and transfer them from mixture components in one generation to mixture components in the next generation.…”
Section: Connecting Mixture Components Over Generations
confidence: 99%
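The distribution multiplier mentioned here scales the ML-estimated covariance up when the search is still making progress far from the mean and back down otherwise. A hedged sketch of such an update, in the style of AMaLGaM's adaptive variance scaling (the thresholds and growth/decay factors below are illustrative assumptions, not the published constants):

```python
# Illustrative distribution-multiplier update in the style of AMaLGaM's
# adaptive variance scaling. The standard-deviation ratio (sdr) measures
# how far the best improvement lies from the distribution mean, in units
# of the standard deviation. Factors and threshold are assumptions.
ETA_INC = 1.0 / 0.9   # grow the multiplier when the model seems too narrow
ETA_DEC = 0.9         # shrink it otherwise, back toward 1.0

def update_multiplier(c_mult, improved, sdr, theta_sdr=1.0):
    if improved:
        if sdr > theta_sdr:
            # Improvements found beyond ~1 st.dev. from the mean suggest
            # the distribution is too narrow: enlarge it.
            return c_mult * ETA_INC
        return c_mult
    # No improvement: contract, but never below the unscaled distribution.
    return max(1.0, c_mult * ETA_DEC)
```

In use, the sampling covariance is multiplied by `c_mult`, so transferring `c_mult` (together with the mean and covariance) between generations, or between mixture components as in the citing work, preserves the search state.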