Proceedings of the Thirty-Sixth Southeastern Symposium on System Theory, 2004
DOI: 10.1109/ssst.2004.1295638
Simulation of a new hybrid particle swarm optimization algorithm

Cited by 41 publications (34 citation statements)
References 5 publications
“…The hybrid PSO was first proposed in 1998 [43], in which a standard selection mechanism is integrated with the PSO. A new hybrid gradient descent PSO (HGPSO), which integrates gradient information to achieve faster convergence without getting trapped in local minima, was proposed by Noel and Jannett [16]. However, the computational demand of the HGPSO is increased by the gradient descent process.…”
mentioning
confidence: 99%
“…The PSO's mutating space varies dynamically along the search based on the properties of the wavelet function. The resulting mutation operation helps the hybrid PSO perform more efficiently and converge faster than the PSO with constriction and inertia weight factors [9] and other hybrid PSOs [1], [16], [17], [28] on a suite of benchmark test functions. In addition, it achieves better and more stable solution quality.…”
mentioning
confidence: 99%
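The statement above describes a mutation whose range is modulated by a wavelet and shrinks as the search progresses. A minimal sketch of that idea, assuming a Morlet-style envelope and an exponential dilation schedule (the function name, bounds handling, and the `shape` parameter are illustrative assumptions, not the cited authors' exact formulation):

```python
import numpy as np

def wavelet_mutation(x, lo, hi, t, T, rng, shape=2.0):
    """Mutate one particle coordinate with a wavelet-modulated step (a sketch).

    The dilation parameter a grows with iteration t, so the mutation
    magnitude shrinks over the run; the Morlet form and the schedule
    below are assumptions for illustration.
    """
    a = np.exp(shape * t / T) * 10          # dilation grows with t (assumed schedule)
    phi = rng.uniform(-2.5 * a, 2.5 * a)    # random point in the wavelet's support
    # Morlet-style wavelet value; |sigma| <= 1/sqrt(a) < 1, so the step stays in bounds
    sigma = np.exp(-(phi / a) ** 2 / 2) * np.cos(5 * phi / a) / np.sqrt(a)
    if sigma >= 0:
        return x + sigma * (hi - x)         # push toward the upper bound, scaled by sigma
    return x + sigma * (x - lo)             # push toward the lower bound

# Example: mutate a coordinate within [0, 1] at iteration t=10 of T=100.
rng = np.random.default_rng(1)
mutated = wavelet_mutation(0.5, 0.0, 1.0, t=10, T=100, rng=rng)
```

Because the wavelet magnitude is bounded by 1, the mutated value always remains within `[lo, hi]`, and because the dilation grows with `t`, early iterations explore widely while late iterations fine-tune.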
“…The simplest approach to calculating the gradient in the HGPSO (Noel & Jannett 2004) is numerical approximation of the gradient.…”
Section: B. Improved PSO Algorithm
mentioning
confidence: 99%
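The numerical gradient approximation mentioned in this statement can be sketched generically with central differences (the function name and the step size `h` are assumptions, not taken from the cited papers):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x
    (a generic sketch; h is an assumed step size)."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2 * h)  # symmetric difference per coordinate
    return grad

# Check against the sphere function f(x) = sum(x**2), whose exact gradient is 2*x.
x = np.array([1.0, -2.0, 0.5])
g = numerical_gradient(lambda v: np.sum(v**2), x)
```

Each coordinate costs two function evaluations, which is the extra computational demand the first citation statement attributes to the gradient-descent step of the HGPSO.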
“…As with genetic algorithms, some effort has gone into improving the performance of particle swarms in this area, with the introduction of both hybrid particle swarms which employ a local search, Shu-Kai et al. (2004), and those which make specific use of gradient information in the velocity update equation, Noel and Jannett (2004) and Ninomiya and Zhang (2008).…”
Section: The Basic Particle Swarm
mentioning
confidence: 99%
“…Rather than use a local search strategy, Noel and Jannett (2004) modified the particle swarm update equation to make use of any available gradient information by adopting a gradient descent term instead of a nostalgia term. The components of the updated velocity were therefore due to the inertia of each particle, the location of the global best point, and a step in the steepest descent direction.…”
Section: Existing Hybridized Particle Swarms
mentioning
confidence: 99%
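The velocity update described above, with the nostalgia (personal-best) term replaced by a steepest-descent step, can be sketched as follows (the coefficients `w`, `c`, and the step size `eta` are assumed values for illustration, not the authors' settings):

```python
import numpy as np

def hgpso_velocity(v, x, gbest, grad, w=0.7, c=1.5, eta=0.01, rng=None):
    """One HGPSO-style velocity update (a sketch, not the authors' exact
    formulation): inertia + attraction to the global best + a steepest-
    descent step in place of the usual personal-best term."""
    rng = rng or np.random.default_rng()
    r = rng.random(x.shape)
    inertia = w * v                # momentum carried over from the last step
    social = c * r * (gbest - x)   # pull toward the global best position
    descent = -eta * grad          # step in the steepest-descent direction
    return inertia + social + descent

# Usage: a single particle minimizing the sphere function f(x) = sum(x**2),
# whose exact gradient is 2*x.
rng = np.random.default_rng(0)
x = np.array([1.0, -2.0])
v = np.zeros(2)
gbest = np.array([0.1, 0.1])
for _ in range(200):
    v = hgpso_velocity(v, x, gbest, grad=2 * x, rng=rng)
    x = x + v
    if np.sum(x**2) < np.sum(gbest**2):  # keep the best point seen so far
        gbest = x.copy()
```

The descent term steers each particle downhill even when the swarm's best-known point is poor, which is the faster-convergence behaviour the citation statements attribute to the HGPSO, at the cost of one gradient evaluation per particle per iteration.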