2013
DOI: 10.1007/978-3-642-33206-7_10
Experimental Analysis of Optimization Algorithms: Tuning and Beyond

Abstract: This chapter comprises the essence of several years of tutorials the authors gave on experimental research in evolutionary computation. We highlight the renaissance of experimental techniques in other fields as well, and then focus on the specific conditions of experimental research in computer science, or more concretely, metaheuristic optimization. The experimental setup is discussed together with the pitfalls awaiting the inexperienced (and sometimes even the experienced). We present a severity criterion as …

Cited by 18 publications (16 citation statements)
References 49 publications
“…Actually, a full factorial design analyzed by means of ANOVA can be considered as the first step in an algorithm calibration. For more exhaustive approaches, the reader is referred to Bartz-Beielstein et al (2010) where advanced techniques are shown. The reason behind our choice of a simple calibration is none other than to avoid an unfair comparison with existing approaches.…”
Section: Calibration of the Proposed Scatter Search Methods
confidence: 99%
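The excerpt above describes a full factorial design analyzed with ANOVA as a first step in algorithm calibration. A minimal sketch of that idea, assuming an illustrative surrogate objective (`run_algorithm`, its parameters, and all level values are hypothetical, not taken from the cited paper): every combination of factor levels is run several times, the best combination is picked by mean performance, and a one-way F statistic is computed by hand for one factor.

```python
import itertools
import random
import statistics

# Hypothetical surrogate for one run of a stochastic optimizer: performance
# depends on two tuning parameters, plus Gaussian noise for run-to-run variation.
def run_algorithm(pop_size, mutation_rate, rng):
    return ((pop_size - 50) ** 2 / 100
            + (mutation_rate - 0.1) ** 2 * 2000
            + rng.gauss(0, 1))

rng = random.Random(42)
levels = {"pop_size": [20, 50, 80], "mutation_rate": [0.05, 0.1, 0.2]}
replicates = 5

# Full factorial design: every combination of factor levels, replicated.
results = {}
for combo in itertools.product(*levels.values()):
    results[combo] = [run_algorithm(*combo, rng) for _ in range(replicates)]

# Calibration step: choose the combination with the best (lowest) mean result.
best = min(results, key=lambda c: statistics.mean(results[c]))
print("best combination:", dict(zip(levels, best)))

# One-way ANOVA F statistic for the pop_size factor, computed by hand:
# large F means the factor's levels explain much of the observed variation.
groups = {}
for (ps, mr), ys in results.items():
    groups.setdefault(ps, []).extend(ys)
grand = statistics.mean(y for ys in groups.values() for y in ys)
ss_between = sum(len(ys) * (statistics.mean(ys) - grand) ** 2
                 for ys in groups.values())
ss_within = sum((y - statistics.mean(ys)) ** 2
                for ys in groups.values() for y in ys)
df_b = len(groups) - 1
df_w = sum(len(ys) for ys in groups.values()) - len(groups)
F = (ss_between / df_b) / (ss_within / df_w)
print(f"pop_size F statistic: {F:.1f}")
```

In a real calibration the F statistic would be compared against an F distribution to decide whether a factor matters before refining its levels; the two-factor design here ignores interaction effects for brevity.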
“…Examples of these efforts include the machine learning based method proposed in [BHBG07], CPLEX automatic tuning tool [CPL14], use of derivative-free optimization [AO06], ParamILS [HHLBS09], and the procedure proposed in [HHLB10] for mixed integer programming solvers. Similarly, some of the tuning techniques for non-deterministic methods include sequential parameter optimization (SPO) [BBLP05,BBP14], relevance and calibration approach [NE06], and F-Race [Bir09].…”
Section: Parameter Tuning and Stopping Conditions
confidence: 99%
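Among the tuning techniques listed above, racing methods such as F-Race evaluate candidate configurations incrementally and discard poor ones early. The sketch below is only loosely inspired by that idea — it eliminates the candidate with the worst running mean each round instead of using F-Race's Friedman test, and the objective `run` is a toy stand-in, not any cited benchmark.

```python
import random
import statistics

# Toy stand-in for one noisy run of a solver under configuration `config`.
def run(config, rng):
    return abs(config - 0.3) + rng.gauss(0, 0.02)

def race(candidates, budget, rng):
    """Racing-style tuner: each round gives every surviving candidate one
    more run, then drops the candidate with the worst mean so far."""
    scores = {c: [] for c in candidates}
    while len(scores) > 1 and budget >= len(scores):
        for c in list(scores):
            scores[c].append(run(c, rng))
            budget -= 1
        worst = max(scores, key=lambda c: statistics.mean(scores[c]))
        del scores[worst]
    return min(scores, key=lambda c: statistics.mean(scores[c]))

rng = random.Random(1)
best = race([0.0, 0.1, 0.2, 0.3, 0.4, 0.5], budget=60, rng=rng)
print("surviving configuration:", best)
```

The appeal of racing over a fixed-budget grid is that evaluation effort concentrates on promising configurations: weak candidates are cut after a few runs, while the finalists accumulate enough replicates for a reliable comparison.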
“…These can customize an initial algorithm setup for a given problem off-line (before the run) or on-line (during the run) [63]. Techniques such as automated parameter tuning [64][65][66][67] and adaptive parameter control continue to make advances in this area [68][69][70][71].…”
Section: Automated Design and Tuning of EAs
confidence: 99%
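On-line (adaptive) parameter control, as contrasted with off-line tuning above, can be illustrated by a classic example: Rechenberg's 1/5th success rule, which adapts the mutation step size of a (1+1)-ES during the run. The sketch below uses a toy sphere objective; all constants (window size, scaling factors) are illustrative choices, not values from the cited works.

```python
import random

def one_plus_one_es(dim=5, iters=500, seed=0):
    """(1+1)-ES on the sphere function with 1/5th-rule step-size control."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    f = sum(v * v for v in x)
    sigma, successes, window = 1.0, 0, 20
    for t in range(1, iters + 1):
        # Mutate the parent and keep the offspring only if it improves.
        y = [v + rng.gauss(0, sigma) for v in x]
        fy = sum(v * v for v in y)
        if fy < f:
            x, f = y, fy
            successes += 1
        # On-line control: every `window` steps, grow sigma if more than
        # 1/5 of recent offspring succeeded, shrink it otherwise.
        if t % window == 0:
            sigma *= 1.5 if successes / window > 0.2 else 0.8
            successes = 0
    return f

print("final fitness:", one_plus_one_es())
```

The point of the rule is that the parameter is not fixed in advance: a high success rate signals the step size is too timid, a low one that it overshoots, so sigma tracks the landscape as the search converges.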