2007
DOI: 10.1016/j.ipl.2006.10.005

Stochastic convergence analysis and parameter selection of the standard particle swarm optimization algorithm

Cited by 426 publications (249 citation statements)
References 6 publications

“…Since the convergence analysis in [8] implies that the general movement of a converging particle swarm drops exponentially, a logarithmic scale is used and linear decrease is expected.…”
Section: Potential and Stagnation Phases
Citation type: mentioning (confidence: 99%)
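As a rough illustration of that point (not the cited experiment), a minimal inertia-weight PSO on the sphere function can log its mean step size: for a converging swarm the step size shrinks roughly exponentially, so its log10 decreases approximately linearly. The parameter values and the sphere objective below are assumptions chosen to lie in the usual convergence region.

```python
# Minimal sketch: standard inertia-weight PSO on the sphere function.
# If the swarm converges, mean step size decays roughly exponentially,
# so log10(mean step) falls approximately linearly with the iteration count.
import numpy as np

rng = np.random.default_rng(0)
dim, n_particles, iters = 10, 20, 200
w, c1, c2 = 0.7, 1.5, 1.5            # assumed values inside the usual convergence region

x = rng.uniform(-5.0, 5.0, (n_particles, dim))
v = np.zeros_like(x)
pbest = x.copy()
pbest_val = np.sum(pbest**2, axis=1)          # sphere objective f(x) = ||x||^2
gbest = pbest[np.argmin(pbest_val)].copy()

for t in range(1, iters + 1):
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v

    val = np.sum(x**2, axis=1)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

    if t % 20 == 0:
        mean_step = np.mean(np.linalg.norm(v, axis=1))
        print(f"iter {t:4d}  log10(mean step) = {np.log10(mean_step):7.3f}")
```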
“…• In [15], a convergence-related parameter adjustment scheme (CRPAS) has been proposed based on the convergence condition [9] of the PSO algorithm. In that reference, nine time-varying parameter schemes are investigated in the numerical experiments, and two of them outperform the others, including the LDIWM and TVAC.…”
Section: Analysis of PSO Algorithms with Time-Varying Parameters
Citation type: mentioning (confidence: 99%)
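For context, a sketch of two schedule families commonly compared in such experiments: a LDIWM-style linearly decreasing inertia weight and TVAC-style time-varying acceleration coefficients. The formulas and endpoint values below are assumptions taken from common usage; the nine schemes and the CRPAS rule from [15] are not reproduced here.

```python
# Sketch of two common time-varying parameter schedules (assumed forms, for
# illustration only): LDIWM-style inertia weight and TVAC-style coefficients.

def ldiwm_inertia(t, t_max, w_start=0.9, w_end=0.4):
    """Linearly decreasing inertia weight from w_start down to w_end."""
    return w_start - (w_start - w_end) * t / t_max

def tvac_coefficients(t, t_max, c1_start=2.5, c1_end=0.5, c2_start=0.5, c2_end=2.5):
    """Time-varying acceleration: cognitive c1 shrinks while social c2 grows."""
    frac = t / t_max
    c1 = c1_start + (c1_end - c1_start) * frac
    c2 = c2_start + (c2_end - c2_start) * frac
    return c1, c2

t_max = 1000
for t in (0, 250, 500, 750, 1000):
    w = ldiwm_inertia(t, t_max)
    c1, c2 = tvac_coefficients(t, t_max)
    print(f"t={t:5d}  w={w:.3f}  c1={c1:.3f}  c2={c2:.3f}  c1+c2={c1+c2:.3f}")
```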
“…However, these approaches assume that the random variables are either constant or subject to deterministic constraints, which limits the results. In contrast, the analysis results in [9,10] have been derived without such assumptions. As a result, the parameter region in [9,10] for the PSO algorithm to converge is very similar to the parameter region in [11] derived empirically from many numerical experiments.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
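As a stand-in sketch of such a parameter-region check (the stochastic region derived in [9,10] is similar in shape but not reproduced here), the widely quoted deterministic condition |w| < 1 and 0 < c1 + c2 < 2(1 + w) can be tested programmatically; the sample parameter triples below are assumptions.

```python
# Sketch: check whether (w, c1, c2) lies in the widely quoted deterministic
# convergence region |w| < 1 and 0 < c1 + c2 < 2*(1 + w). This is only a
# stand-in for the stochastic region derived in [9,10].

def in_deterministic_region(w: float, c1: float, c2: float) -> bool:
    phi = c1 + c2
    return abs(w) < 1.0 and 0.0 < phi < 2.0 * (1.0 + w)

samples = [
    (0.7298, 1.49618, 1.49618),  # Clerc-style constriction values: inside
    (0.9,    2.0,     2.0),      # phi = 4.0 > 2*(1+0.9) = 3.8: outside
    (0.4,    1.0,     1.0),      # phi = 2.0 < 2*(1+0.4) = 2.8: inside
]
for w, c1, c2 in samples:
    verdict = "inside" if in_deterministic_region(w, c1, c2) else "outside"
    print(f"w={w:.4f}  c1={c1:.5f}  c2={c2:.5f}  ->  {verdict}")
```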
“…The swarm technique has proven to outperform GA for several reasons. In particular, the rate of convergence can be controlled through a good selection of the swarm parameters [16,17]. It is also simpler than GA and much easier to adjust to obtain faster convergence [18,19].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)