2019
DOI: 10.1007/978-3-030-29414-4_1

Probabilistic Tools for the Analysis of Randomized Optimization Heuristics

Abstract: This chapter collects several probabilistic tools that proved to be useful in the analysis of randomized search heuristics. This includes classic material like Markov, Chebyshev and Chernoff inequalities, but also lesser known topics like stochastic domination and coupling or Chernoff bounds for geometrically distributed random variables and for negatively correlated random variables. Most of the results presented here have appeared previously, some, however, only in recent conference publications. While the f…

Cited by 120 publications (91 citation statements)
References 104 publications

“…However, in some instances, particularly with δ = 50, the old model outperforms the new model. The solutions in NSGA-II with old and PS (10) and NSGA-II with new and PS (12) indicate that in most instances, the new model performs better than the old model when using the PS crossover operator.…”
Section: Experimental Results for NSGA-II
confidence: 99%
“…Chebyshev's inequality has high utility because it can be applied to any probability distribution with known expectation and standard deviation of the design variables. It also gives a tighter bound than weaker tail bounds such as Markov's inequality [12]. The standard Chebyshev's inequality is two-sided and provides tails for both upper and lower bounds [5].…”
Section: Surrogate Functions Based on Tail Bounds
confidence: 99%
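The comparison drawn in the quoted statement can be made concrete with a small numerical sketch. The mean µ = 10, standard deviation σ = 2, and threshold a = 16 below are assumed illustrative values for a hypothetical nonnegative design variable, not figures from the cited work:

```python
# Hypothetical design variable: mean, std, and threshold are assumed values,
# chosen only to illustrate the bounds quoted above.
mu, sigma = 10.0, 2.0
a = 16.0  # threshold for the upper tail P(X >= a)

# Markov's inequality (requires X >= 0): P(X >= a) <= E[X] / a
markov = mu / a

# Chebyshev's inequality (two-sided): P(|X - mu| >= t) <= Var[X] / t^2,
# which in particular bounds the upper tail with t = a - mu.
chebyshev = sigma**2 / (a - mu)**2

print(f"Markov bound:    {markov:.3f}")     # 0.625
print(f"Chebyshev bound: {chebyshev:.3f}")  # 0.111
```

As the quote notes, Chebyshev is markedly tighter here (0.111 vs. 0.625) because it exploits the variance, while Markov uses only the mean.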
“…By assuming c ≤ 1 and using the elementary estimate e^x ≤ 1 + 2x, valid for x ∈ [0, 1] (see, e.g., Lemma 4.2(b) in [Doe18b]), we have 1 − P + P exp(c/µ) ≤ 1 + 2P(c/µ). Hence with P ≤ 2/n, c ≤ 1, µ ≥ 1, and ℓ ≤ n/320, we obtain…”
Section: Results
confidence: 99%
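The elementary estimate this step relies on can be verified numerically, and the quoted inequality replayed for concrete numbers. The values of n, c, and µ below are assumed for illustration (chosen to satisfy P ≤ 2/n, c ≤ 1, µ ≥ 1); they are not taken from the cited proof:

```python
import math

# Check the elementary estimate e^x <= 1 + 2x on a grid over [0, 1]
# (Lemma 4.2(b) in [Doe18b], per the quoted statement).
assert all(math.exp(x) <= 1 + 2 * x for x in (i / 1000 for i in range(1001)))

# Replay the quoted step 1 - P + P*exp(c/mu) <= 1 + 2*P*(c/mu)
# for assumed illustrative parameters with P <= 2/n, c <= 1, mu >= 1.
n, c, mu = 1000, 1.0, 5.0
P = 2 / n
lhs = 1 - P + P * math.exp(c / mu)
rhs = 1 + 2 * P * (c / mu)
assert lhs <= rhs
print(lhs, rhs)
```

The estimate applies because c ≤ 1 and µ ≥ 1 together put x = c/µ inside [0, 1].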
“…where the last estimate stems from the additive Chernoff bound (e.g., Theorem 10.9 in [Doe18]). Using the estimates…”
Section: Summary of Useful Tools
confidence: 99%