Proceedings of the 14th Annual Conference on Genetic and Evolutionary Computation 2012
DOI: 10.1145/2330163.2330167

Ants easily solve stochastic shortest path problems

Abstract: The recent theoretical analysis (Horoba, Sudholt (GECCO 2010)) of an ant colony optimizer for the stochastic shortest path problem suggests that the ant system experiences significant difficulties when the input data is prone to noise. In this work, we propose a slightly different ant optimizer to deal with noise. We prove that, under mild conditions, it finds the paths with shortest expected length efficiently, despite the fact that we do not have convergence in the classic sense. To prove our results, we introdu…
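
As a rough illustration of the algorithmic idea, here is a minimal sketch of an MMAS-style ant optimizer for a noisy shortest path instance in which the best-so-far path is reevaluated with fresh noise in every iteration; the graph encoding, noise model, and parameter names are assumptions made for this example, not the authors' exact formulation.

```python
import random

# Minimal sketch of an MMAS-style ant optimizer for a noisy shortest path
# instance. The key twist: the best-so-far path is reevaluated with fresh
# noise in every iteration, so one lucky evaluation of a bad path cannot
# mislead the search forever. Graph encoding, noise model, and parameters
# below are illustrative assumptions.

def noisy_length(path, sample_weight):
    # One noisy evaluation: every edge weight is drawn fresh.
    return sum(sample_weight(u, v) for u, v in zip(path, path[1:]))

def construct_path(adj, tau, source, sink, max_steps=100):
    # Follow edges with probability proportional to their pheromone value.
    path, node = [source], source
    while node != sink and len(path) <= max_steps:
        nbrs = adj[node]
        if not nbrs:
            return None
        node = random.choices(nbrs, weights=[tau[(node, w)] for w in nbrs])[0]
        path.append(node)
    return path if node == sink else None

def mmas_reeval(adj, sample_weight, source, sink,
                iters=1000, rho=0.1, tau_min=0.01, tau_max=1.0):
    edges = [(u, v) for u in adj for v in adj[u]]
    tau = {e: 1.0 for e in edges}          # uniform initial pheromones
    best = None
    for _ in range(iters):
        cand = construct_path(adj, tau, source, sink)
        if cand is None:
            continue
        # Best-so-far is REevaluated here, not compared via a stored value.
        if best is None or (noisy_length(cand, sample_weight)
                            <= noisy_length(best, sample_weight)):
            best = cand
        best_edges = set(zip(best, best[1:]))
        for e in edges:                    # standard MMAS pheromone update
            tau[e] = (1 - rho) * tau[e] + (rho if e in best_edges else 0.0)
            tau[e] = min(tau_max, max(tau_min, tau[e]))
    return best
```

Under the paper's mild conditions, a variant of this kind concentrates the pheromones on a path of shortest expected length, even though the pheromone values never converge in the classic sense.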

Cited by 58 publications (41 citation statements)
References 31 publications

“…This gives a qualitative difference to [14] and [4], where the algorithms did not favor paths that are good in expectation, but instead paths that either have low weights with reasonable probability (in [14]) or come out better than others in a direct comparison with probability higher than 0.5 (in [4]). As an illustration, consider the following (multi-)graph with random variables A, B and C as edge weights.…”
Section: Introduction
confidence: 93%
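
To make the quoted distinction concrete, the following Monte Carlo sketch (with assumed distributions, not those of the citing paper) exhibits an edge A that wins a direct comparison against a deterministic edge C more than half of the time, even though C has the smaller expected weight.

```python
import random

# Monte Carlo sketch with assumed distributions: edge A wins a direct
# comparison against the deterministic edge C more than half the time,
# yet C has the smaller expected weight.

def sample_A():
    # A = 0 with probability 0.6, and 10 otherwise, so E[A] = 4.
    return 0.0 if random.random() < 0.6 else 10.0

C = 1.0  # deterministic edge weight, E[C] = 1

n = 100_000
wins = sum(sample_A() < C for _ in range(n))
mean_A = sum(sample_A() for _ in range(n)) / n

print(f"P(A < C) ~= {wins / n:.3f}")          # about 0.6
print(f"E[A] ~= {mean_A:.2f} vs E[C] = {C}")  # about 4 vs 1
```

An optimizer that favors pairwise winners would therefore prefer A, while one that favors paths of shortest expected length prefers C.
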
“…In [4], the MMAS-el algorithm was modified by reevaluating the best-so-far solution in every iteration, to avoid being permanently misled by a single exceptionally good evaluation of a non-optimal solution. The paper shows that, in this case, the pheromones converge to the solution that wins a direct comparison against any other solution more than half of the time; however, such a solution need not exist, and even if it exists, it is not necessarily the solution that is optimal in expectation.…”
Section: Introduction
confidence: 99%
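
A concrete way to see why such a pairwise-dominant solution need not exist is a nontransitive family of fitness distributions; the classic dice values below are an illustrative assumption, not taken from the cited works.

```python
from itertools import product
from fractions import Fraction

# Three "solutions" with nontransitive noisy fitness values (the classic
# dice construction; the concrete numbers are an illustrative assumption).
# Each value set is sampled uniformly; smaller is better (minimization).

A, B, C = (2, 4, 9), (1, 6, 8), (3, 5, 7)

def p_better(x, y):
    # Probability that an independent sample of x beats a sample of y.
    return Fraction(sum(a < b for a, b in product(x, y)), len(x) * len(y))

# B beats A, C beats B, and A beats C, each with probability 5/9 > 1/2:
print(p_better(B, A), p_better(C, B), p_better(A, C))   # 5/9 5/9 5/9
# Hence no solution wins against every other one more than half the time.
```
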
“…For example, for the (1+1)-EA without noise, N_1 = 1 and N_2 = 1. Note that, for EAs under noise, we assume that the reevaluation strategy [13,14,18] is used, i.e., when accessing the fitness of a solution, it is always reevaluated. For example, for the (1+1)-EA with sampling, both f^N_k(x′) and f^N_k(x) will be calculated and recalculated in each iteration; thus, N_1 = k and N_2 = 2k.…”
Section: Evolutionary Algorithms by Markov Chain Analysis
confidence: 99%
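
Reading N_1 as the number of fitness evaluations spent before the loop and N_2 as the number of evaluations per iteration (our reading of the quoted convention), the sketch below of a (1+1)-EA with sampling of size k and reevaluation makes the counting explicit; the OneMax objective and the Gaussian noise model are illustrative assumptions.

```python
import random

# Sketch of a (1+1)-EA on OneMax under additive Gaussian noise, using
# sampling of size k together with the reevaluation strategy: the parent is
# not cached but re-sampled in every iteration alongside the offspring.
# The counter illustrates N_1 = k (before the loop) and N_2 = 2k (per
# iteration). Objective and noise model are illustrative assumptions.

def noisy_f(x, sigma=1.0):
    return sum(x) + random.gauss(0.0, sigma)   # true fitness plus noise

def sampled_f(x, k, evals):
    evals[0] += k                              # k independent noisy samples
    return sum(noisy_f(x) for _ in range(k)) / k

def one_plus_one_ea(n=20, k=5, iters=100):
    evals = [0]
    x = [random.randint(0, 1) for _ in range(n)]
    sampled_f(x, k, evals)   # N_1 = k; the value is re-sampled later anyway
    for _ in range(iters):
        y = [b ^ (random.random() < 1.0 / n) for b in x]   # bit-flip mutation
        # Reevaluation: offspring AND parent each cost k samples -> N_2 = 2k.
        if sampled_f(y, k, evals) >= sampled_f(x, k, evals):
            x = y
    return x, evals[0]

x, total = one_plus_one_ea()
print(total)   # k + iters * 2k = 5 + 100 * 10 = 1005
```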