Proceedings of 2004 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.04EX826)
DOI: 10.1109/icmlc.2004.1382171
Research on particle swarm optimization: a review

Abstract: Particle Swarm Optimization (PSO) explores the global optimal solution by exploiting the particle's memory and the swarm's memory. Its low constraints on the continuity of the objective function and the search space, together with its ability to adapt to dynamic environments, make PSO one of the most important Swarm Intelligence methods and Evolutionary Computation algorithms. The fundamental and standard algorithm is introduced first. Then the work on algorithm improvement during the past years is…


Cited by 75 publications (22 citation statements)
References 22 publications
“…Particle swarm optimization (PSO) applies variations to a population (swarm) of candidate solutions with respect to a given fitness function to approach a desired solution [8]. In this iterative procedure, the specific local solutions are modified at each iteration based on the best global solution, the previous-best local solution, and its last modification (i.e., inertia).…”
Section: Design Methodsmentioning
confidence: 99%
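The update rule described in the statements above (inertia term, personal-best memory, and global-best memory) can be sketched as a minimal global-best PSO. The coefficient values, bounds, and test function below are illustrative assumptions, not taken from the cited papers:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def pso_minimize(f, dim, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimize f over a box with a basic global-best PSO."""
    lo, hi = bounds
    # Random initial positions; zero initial velocities.
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]            # each particle's best-known position
    pbest_val = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive (particle memory) + social (swarm memory)
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            val = f(xs[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = xs[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = xs[i][:], val
    return gbest, gbest_val

# Usage: minimize the 2-D sphere function; the swarm contracts onto the origin.
best, best_val = pso_minimize(lambda x: sum(t * t for t in x), dim=2)
```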
“…Thus, one may note that the PSO is implemented through an iterative algorithm. Each particle is initially placed at different hyperspaces and, after each iteration (season), they move around, with an update speed factor, towards the solution of the problem [21]. It is important to notice that all of the hyperspaces are mapped on the domain of the problem and, as it has already been stated, the problem itself is to find the minimum point of a user-defined cost function.…”
Section: Psomentioning
confidence: 99%
“…Another noteworthy point is the role of the inertia weight w. The greater this coefficient is, the more widely the domain is explored. In other words, higher values of w favor the search for the global minimum of the cost function [21]. Nonetheless, a couple of issues can arise from choosing very large values of w. Firstly, as w grows the convergence of the particles slows, raising a trade-off between the quality and the speed of the search.…”
Section: Psomentioning
confidence: 99%
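The exploration-versus-convergence trade-off noted above is often handled by annealing w over the run. The linear schedule below (0.9 down to 0.4) is a common convention and an assumption here, not the scheme of the quoted paper [21]:

```python
def linear_inertia(t, t_max, w_start=0.9, w_end=0.4):
    """Linearly anneal the inertia weight from w_start down to w_end."""
    return w_start - (w_start - w_end) * t / t_max

# Wide exploration early (w near 0.9), local refinement late (w near 0.4).
ws = [linear_inertia(t, t_max=99) for t in range(100)]
```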
“…Particle swarm optimization (PSO) originates from the natural behavior of a flock of birds, a school of fish, or a swarm of bees. James Kennedy and Russell Eberhart, the founders of PSO, used the technique to optimize nonlinear functions [30,31]. By applying this bio-inspired algorithm, constrained or unconstrained optimization problems can be solved efficiently and quickly.…”
Section: Particle Swarm Optimization Algorithmmentioning
confidence: 99%