2020
DOI: 10.20944/preprints202003.0381.v1
Preprint

The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms

Abstract: The Stochastic Optimisation Software (SOS) is a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. It reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-demanding tasks such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, measuring algorithmic overhead, etc. SOS provides numerous off-the-shelf methods including 1) …
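For orientation, the sketch below illustrates the kind of workflow the abstract describes: registering algorithms and testbed problems, fixing a computational budget, and repeating independent runs so that results can later be compared statistically. All class names (BenchmarkExperiment, Algorithm, Problem) are hypothetical placeholders, not the actual SOS API, which should be taken from the platform's documentation and source repository.

// Hypothetical sketch of a benchmarking workflow of the kind the SOS abstract
// describes. Class names are placeholders, NOT the real SOS API.
import java.util.List;

interface Problem {
    int dimension();
    double evaluate(double[] x);          // fitness of a candidate solution
}

interface Algorithm {
    String name();
    double[] optimise(Problem p, int budget, long seed); // best solution found
}

final class BenchmarkExperiment {
    private final List<Algorithm> algorithms;
    private final List<Problem> problems;
    private final int budget;   // max fitness evaluations per run
    private final int runs;     // independent repetitions per (algorithm, problem) pair

    BenchmarkExperiment(List<Algorithm> algorithms, List<Problem> problems,
                        int budget, int runs) {
        this.algorithms = algorithms;
        this.problems = problems;
        this.budget = budget;
        this.runs = runs;
    }

    // Runs every algorithm on every problem and reports the best fitness per run.
    void run() {
        for (Problem p : problems) {
            for (Algorithm a : algorithms) {
                for (int r = 0; r < runs; r++) {
                    double[] best = a.optimise(p, budget, /* seed = */ r);
                    System.out.printf("%s on %dD problem, run %d: f = %.6e%n",
                            a.name(), p.dimension(), r, p.evaluate(best));
                }
            }
        }
    }
}

Per-run best-fitness values collected in this way are the raw material that a platform such as SOS would pass to its result-processing and statistical comparison routines.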


Cited by 12 publications (13 citation statements)
References 69 publications
“…A brief description of the employed methods is reported in the remainder of this section. These optimisation methods are implemented and tested, with the parameter configurations suggested in their original papers, using the software platform [11]. Details on their implementations can be obtained from the source code made available in the online repository [24], thus facilitating the replicability of the results presented in this study.…”
Section: Methods Used / Experimental Setup (mentioning)
confidence: 99%
“…As more than abundant literature [11], [12] suggests, optimisation methods can be compared in the light of different aspects: best/average/worst performance, complexity, universality of application, memory usage, scalability, etc. Moreover, performance can be evaluated on a class of functions (with widely varying definitions of 'class') or over all possible problems (which is practically impossible but theoretically relevant in the context of global convergence proofs).…”
Section: Structural Bias (mentioning)
confidence: 99%
“…All algorithms refer to their persistent elitist variants. All experiments are executed on a standard desktop using the SOS platform [3] implemented in Java (algorithms' source code is available online). It is worth mentioning that the aforementioned pseudorandom generator used for all experiments is considered on the better side of the scale for linear congruential generators [15].…”
Section: Methods (mentioning)
confidence: 99%
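As background on the term used in the excerpt above, a linear congruential generator advances an integer state with the recurrence x_{n+1} = (a * x_n + c) mod m. The excerpt does not say which generator or constants the cited experiments used; the sketch below simply borrows the well-known constants of java.util.Random for illustration.

// Background sketch of a linear congruential generator (LCG).
// The constants are those of java.util.Random (48-bit state) and are
// illustrative only; they are not taken from the cited study.
public final class LcgSketch {
    private static final long MULTIPLIER = 0x5DEECE66DL;
    private static final long INCREMENT  = 0xBL;
    private static final long MASK       = (1L << 48) - 1; // i.e. modulus m = 2^48

    private long state;

    public LcgSketch(long seed) {
        this.state = seed & MASK;
    }

    // One step of x_{n+1} = (a * x_n + c) mod m, returning a double in [0, 1)
    // built from the top 26 bits of the state (simpler than java.util.Random,
    // which combines two draws for 53 bits of precision).
    public double nextDouble() {
        state = (MULTIPLIER * state + INCREMENT) & MASK;
        return (state >>> 22) / (double) (1L << 26);
    }
}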
“…The field of EAs is saturated with a multitude of nature-inspired algorithms [2,3]. For practical reasons, these algorithms need to be compared and characterised.…”
Section: Structural Bias (mentioning)
confidence: 99%