2017
DOI: 10.1007/s11721-017-0150-9
Self-adaptive particle swarm optimization: a review and analysis of convergence

Abstract: Particle swarm optimization (PSO) is a population-based, stochastic search algorithm inspired by the flocking behaviour of birds. The PSO algorithm has been shown to be rather sensitive to its control parameters, and thus performance may be greatly improved by employing appropriately tuned parameters. However, parameter tuning is typically a time-intensive empirical process. Furthermore, a priori parameter tuning makes the implicit assumption that the optimal parameters of the PSO algorithm are not time-dependen…

Cited by 112 publications (49 citation statements)
References 46 publications
“…To alleviate the issue of time-sensitive parameter values, various self-adaptive particle swarm optimization (SAPSO) algorithms that adapt their control parameters over time have been proposed [12,18,19,20,21,22,23]. While many such adaptive schemes have been proposed, their performance has largely been unconvincing [24,25,26,27]. However, the poor performance of SAPSO algorithms can be somewhat explained by the fact that such algorithms are concurrently optimizing two highly inter-dependent continuous search problems.…”
Section: Introduction
confidence: 99%
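The excerpt above describes SAPSO algorithms that adapt their control parameters over time rather than fixing them a priori. As a point of reference, the sketch below shows plain PSO with a linearly decreasing inertia weight, one common time-varying parameter scheme; this is an illustrative minimal example, not the reviewed paper's method, and all function and parameter names are hypothetical.

```python
import random

def pso_time_varying(f, dim, n_particles=20, iters=200,
                     w_start=0.9, w_end=0.4, c1=1.49618, c2=1.49618):
    """Minimal PSO sketch with a linearly decreasing inertia weight
    (a simple time-dependent control-parameter schedule)."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        # Inertia weight decays linearly over time: the parameter is
        # explicitly iteration-dependent rather than tuned once up front.
        w = w_start - (w_start - w_end) * t / (iters - 1)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)
best, best_val = pso_time_varying(sphere, dim=2)
```

A fully self-adaptive PSO would go further and treat w, c1, and c2 themselves as search variables, which is the coupled optimization problem the excerpt points to.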
“…Several studies [117]- [119] have attempted to dynamically adjust these parameters to achieve the mentioned characteristics of the PSO for obtaining an optimal solution of the studied optimization problem. In [120], the authors have extensively reviewed the literature to address the convergence behaviors of 18 different self-adaptive PSO algorithms both empirically and analytically. However, in the majority of the PSO based MG controllers, these parameters are chosen as constant values throughout the simulation time as shown in Table 3.…”
Section: A. Improved Versions of PSO
confidence: 99%
“…In HO, a shifted cosine function is used to control the elimination rate:

v_r = 0.5 · cos(ite · π / maxIte) + 0.5    (4)

Then adjust N_ite and Q_ite by…”
Section: B. Procedures of the Proposed Algorithm
confidence: 99%
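The shifted cosine schedule quoted above (Eq. 4) decays smoothly from 1 at the first iteration to 0 at the last. A direct evaluation, assuming ite is the current iteration and maxIte the iteration budget as in the excerpt:

```python
import math

def elimination_rate(ite, max_ite):
    """Shifted cosine schedule from the quoted excerpt (Eq. 4):
    v_r = 0.5 * cos(ite * pi / maxIte) + 0.5, which falls smoothly
    from 1.0 at ite = 0 to 0.0 at ite = max_ite."""
    return 0.5 * math.cos(ite * math.pi / max_ite) + 0.5

# Sampling the schedule over a 100-iteration run:
schedule = [elimination_rate(t, 100) for t in range(0, 101, 25)]
```

The cosine shape keeps the rate near its extremes for longer than a linear decay would, changing fastest at the midpoint of the run.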
“…In the past decades, many population-based stochastic search methods have been proposed, such as genetic algorithm (GA) [2], differential evolution (DE) [3], particle swarm optimization (PSO) [4], ant colony optimization (ACO) [5], artificial fish (AF) [6], artificial bee colony (ABC) [7], simulated annealing (SA) [8], etc. GA includes three main evolution strategies, i.e., selection, crossover and mutation.…”
Section: Introduction
confidence: 99%
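The excerpt above names selection, crossover, and mutation as GA's three main evolution strategies. A minimal sketch of one generation loop illustrating all three (an illustrative toy, not any cited paper's implementation; tournament size, crossover point, and mutation noise are arbitrary assumptions):

```python
import random

def ga_minimize(f, dim, pop_size=30, gens=100, p_mut=0.1):
    """Toy genetic algorithm showing the three strategies the excerpt
    names: selection, crossover, and mutation."""
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        def select():
            # Selection: binary tournament, fitter of two random individuals
            a, b = random.sample(pop, 2)
            return a if f(a) < f(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            # Crossover: single-point recombination of the two parents
            point = random.randrange(1, dim) if dim > 1 else 0
            child = p1[:point] + p2[point:]
            # Mutation: small Gaussian perturbation applied per gene
            child = [g + random.gauss(0, 0.1) if random.random() < p_mut else g
                     for g in child]
            children.append(child)
        pop = children
    return min(pop, key=f)

best = ga_minimize(lambda x: sum(v * v for v in x), dim=2)
```

DE, PSO, ACO, and the other methods the excerpt lists replace these operators with their own variation mechanisms, but the generate-evaluate-replace loop is the shared population-based skeleton.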