2009 IEEE Congress on Evolutionary Computation 2009
DOI: 10.1109/cec.2009.4983359
On simultaneous perturbation particle swarm optimization

Abstract: In this paper, we describe simultaneous perturbation particle swarm optimization, a combination of particle swarm optimization and the simultaneous perturbation optimization method. The method has the global search capability of particle swarm optimization and the local search capability of a gradient method via simultaneous perturbation. Some variations of the method are described. Comparisons between these methods and ordinary particle swarm optimization are shown through five test functions and le…
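The abstract describes coupling the PSO velocity update with a gradient term obtained by simultaneous perturbation, which estimates all gradient components from very few function evaluations. Below is a minimal Python sketch of one plausible combination; the one-sided perturbation form, the coefficient values, the way the gradient term is subtracted from the velocity, and the sphere test function are illustrative assumptions, not details taken from the paper.

```python
import random

def sp_gradient(f, x, c=1e-3):
    """One-sided simultaneous perturbation gradient estimate:
    perturb every coordinate at once with a random +/-c sign vector,
    so only two function evaluations are needed regardless of dimension."""
    delta = [random.choice((-1.0, 1.0)) for _ in x]
    fx = f(x)
    fp = f([xi + c * di for xi, di in zip(x, delta)])
    return [(fp - fx) / (c * di) for di in delta]

def sp_pso(f, dim=2, n_particles=10, iters=200,
           w=0.7, c1=1.5, c2=1.5, alpha=0.01, seed=0):
    """Sketch of a simultaneous perturbation PSO: the standard PSO
    velocity update plus a gradient-descent term from sp_gradient.
    All parameter values here are illustrative."""
    rng = random.Random(seed)
    random.seed(seed)  # sp_gradient uses the module-level RNG
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            grad = sp_gradient(f, xs[i])
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d])
                            - alpha * grad[d])       # local-search term
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pval[i]:
                pval[i], pbest[i] = fx, xs[i][:]
                if fx < gval:
                    gval, gbest = fx, xs[i][:]
    return gbest, gval

sphere = lambda x: sum(xi * xi for xi in x)
best, val = sp_pso(sphere)
```

The appeal of the simultaneous perturbation term is that it adds local gradient information at a fixed cost of two extra objective evaluations per particle per iteration, independent of the problem dimension.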

Cited by 10 publications (6 citation statements) · References 14 publications
“…J. Alespector et al. and G. Cauwenberghs also proposed a parallel gradient descent method and a stochastic error-descent algorithm, respectively, which are identical to the simultaneous perturbation learning rule [8,9]. Many applications of simultaneous perturbation are reported in the fields of neural networks [10] and their hardware implementation [11,12].…”
Section: Simultaneous Perturbation Methods
confidence: 97%
“…A combined method of particle swarm optimization and simultaneous perturbation was proposed by Y. Maeda [12]. In this work, the update algorithm, which combines particle swarm optimization and simultaneous perturbation, is applied to all the particles uniformly.…”
Section: Simultaneous Perturbation
confidence: 99%
“…However, in standard PSO, local information about the objective function, such as the gradient, is not used (Maeda and Matsushita, 2008; Maeda et al., 2009). This means that even if a search point of PSO lies near the optimal value, the particle may pass it by, as local information is not considered.…”
Section: Mfastslam Using Soft Computing
confidence: 99%
“…Therefore, these two points are promising advantages, much as with the genetic algorithm. If PSO could utilize local information without complicated calculation or direct calculation of the gradient, the efficiency of the algorithm would improve without losing the previous two advantages of PSO (Maeda & Kuratani, 2006).…”
Section: Introduction
confidence: 99%