2016 European Control Conference (ECC)
DOI: 10.1109/ecc.2016.7810572

On the premature convergence of particle swarm optimization

Cited by 12 publications (6 citation statements)
References 21 publications
“…In other words, once the particles move closer to the global best, which is a local optimum, they will lose the ability to search for the global optimum (Limlawan and Pongchairerks, 2010). This phenomenon is in fact a well-known problem of PSO (Larsen et al., 2016). Most of the proposed solutions to mitigate it are problem-dependent, but they are generally based on random perturbation (Ünal and Kayakutlu, 2020) or on hybrid approaches that combine the strengths of PSO with other GA methods to allow the search to escape local minima (Fan and Jen, 2019).…”
Section: Mitigating Underestimation Bias
confidence: 99%
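The perturbation-based mitigation mentioned in the excerpt above can be illustrated with a minimal sketch. This is a generic, hypothetical illustration rather than the specific method of any of the cited works: the stagnation test (spread of the personal bests around the global best) and the fraction of particles re-scattered are assumptions made here for the example.

```python
import numpy as np

def sphere(x):
    # Toy objective; any minimization problem works here.
    return np.sum(x ** 2)

def pso_with_perturbation(f, dim=10, n_particles=30, iters=500,
                          w=0.7, c1=1.5, c2=1.5,
                          bounds=(-5.0, 5.0), stall_tol=1e-3, seed=0):
    """Generic PSO with a random-perturbation restart on stagnation.

    The diversity threshold and the half-swarm restart are illustrative
    choices, not taken from the cited papers.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in pbest])
    g = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)

        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()

        # Diversity check: if the personal bests have collapsed onto the
        # global best, randomly re-scatter half of the swarm to restore
        # exploration and escape the local basin.
        spread = np.mean(np.linalg.norm(pbest - g, axis=1))
        if spread < stall_tol:
            idx = rng.choice(n_particles, n_particles // 2, replace=False)
            x[idx] = rng.uniform(lo, hi, (len(idx), dim))
            v[idx] = 0.0

    return g, f(g)

best_x, best_val = pso_with_perturbation(sphere)
print(best_val)
```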
“…PSO's advantages include being easy to implement, having only a few parameters to set, having a fast convergence speed, being effective in global search, and being computationally inexpensive in terms of both memory and CPU requirements. However, even though PSO is an efficient and fast search algorithm, it also has some disadvantages, such as the premature convergence problem: it can become trapped in a local minimum in high-dimensional spaces, especially on complex problems [25].…”
Section: Proposed Algorithm PSOGS
confidence: 99%
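The attraction that drives the premature convergence described above is visible directly in the canonical velocity and position update. The few lines below are a generic restatement of the standard PSO step (coefficient values are illustrative): both stochastic terms pull a particle toward its personal best and the global best, so once the global best settles in a local optimum and velocities shrink, no term remains that promotes exploration.

```python
import numpy as np

rng = np.random.default_rng()

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO update for a (n_particles, dim) swarm.

    There is no repulsive or mutation term, which is why the swarm can
    converge prematurely once gbest is a local optimum.
    """
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```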
“…PSO is a fast algorithm that is easy to implement and is considered one of the most powerful and widely used algorithms for solving global optimization problems. Nevertheless, PSO has disadvantages, such as premature convergence and falling into a local minimum in high-dimensional space [25]. Similarly, GS is also easy to implement and understand, has high optimization accuracy, and has strong global optimization ability.…”
Section: Introduction
confidence: 99%
“…• DE and CMA-ES look, in general, to be preferable to PSO and BO; • BO is not competitive enough in both case studies and, in particular, its effectiveness deteriorates in the larger case study, probably because BO is not good enough for large dimensionalities (indeed, it is generally used to tune the few hyper-parameters of machine learning models [30]); • regarding robustness, (1 + 1)-ES is clearly the most robust algorithm; • the high variance for the results of the PSO algorithm may indicate the presence of a considerable amount of local optima in the fitness landscape, given that PSO is prone to prematurely converge to local optima [56];…”
Section: Algorithm
confidence: 99%