2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence)
DOI: 10.1109/cec.2008.4631320

Dynamic multi-swarm particle swarm optimizer with local search for Large Scale Global Optimization

Cited by 177 publications (96 citation statements). References 9 publications.
“…In Dynamic Multi-Swarm PSO (DMS-PSO) by Liang and Suganthan [33], small-sized (3-5 particle) neighborhoods are randomly re-defined, thereby forcing an exchange of information among the particles. Recently, the modified DMS-PSO with exploitation by a quasi-Newton method by Zhao et al. [34] achieved results comparable to, or better than, other optimization algorithms in the large-scale optimization competition held at the CEC 2008 conference.…”
Section: Improving Exploration and Exploitation
confidence: 90%
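The regrouping mechanism described in the statement above can be illustrated with a short sketch: the population is split into small sub-swarms of 3-5 particles, each sub-swarm follows its own local best, and membership is reshuffled every few generations so information spreads across the whole population. The Python sketch below is a simplified illustration of that idea, not the authors' implementation; the function name, parameter values, and regrouping period are illustrative assumptions.

```python
import numpy as np

def dms_pso(f, dim, n_particles=20, group_size=4, regroup_every=5,
            iters=200, lb=-100.0, ub=100.0,
            w=0.729, c1=1.49445, c2=1.49445, rng=None):
    """Sketch of a dynamic multi-swarm PSO with random regrouping.
    Assumes n_particles is divisible by group_size."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.uniform(lb, ub, (n_particles, dim))          # positions
    v = np.zeros((n_particles, dim))                     # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    groups = rng.permutation(n_particles).reshape(-1, group_size)

    for t in range(iters):
        if t % regroup_every == 0:                       # random regrouping step
            groups = rng.permutation(n_particles).reshape(-1, group_size)
        for g in groups:
            lbest = g[np.argmin(pbest_f[g])]             # best member of this sub-swarm
            r1, r2 = rng.random((2, len(g), dim))
            v[g] = (w * v[g] + c1 * r1 * (pbest[g] - x[g])
                    + c2 * r2 * (pbest[lbest] - x[g]))
            x[g] = np.clip(x[g] + v[g], lb, ub)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]

    best = np.argmin(pbest_f)
    return pbest[best], pbest_f[best]

# Example usage on the sphere function:
# best_x, best_f = dms_pso(lambda z: np.sum(z**2), dim=30)
```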
“…It has shown better performance on multi-modal problems but fails to perform an efficient local search. A new DMS-PSO, incorporating a quasi-Newton method, is proposed in [37] for Large Scale Global Optimization problems. This quasi-Newton method improves the local search ability of DMS-PSO.…”
Section: Memetic Algorithms
confidence: 99%
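The exploitation step these statements refer to, handing the swarm's incumbent solution to a quasi-Newton routine for a bounded amount of refinement, can be sketched as follows. This sketch uses SciPy's BFGS minimizer as a stand-in quasi-Newton method; the iteration budget and the acceptance rule are assumptions, not details taken from the cited paper.

```python
import numpy as np
from scipy.optimize import minimize

def quasi_newton_refine(f, best_x, max_iter=20):
    """Refine the swarm's best position with a few BFGS (quasi-Newton) iterations
    and keep the result only if it actually improves the objective."""
    res = minimize(f, best_x, method="BFGS", options={"maxiter": max_iter})
    if res.fun < f(best_x):
        return res.x, res.fun
    return np.asarray(best_x), f(best_x)

# Example: refine an incumbent solution of the Rosenbrock function.
# from scipy.optimize import rosen
# refined_x, refined_f = quasi_newton_refine(rosen, np.full(30, 0.5))
```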
“…It is used to prevent trapping in local optima. The same structure is used in [3,4]; however, a local search that employs a quasi-Newton method is added. In [4] this structure is applied to large-scale optimization.…”
Section: Introduction
confidence: 99%
“…The same structure is used in [3,4]; however, a local search that employs a quasi-Newton method is added. In [4] this structure is applied to large-scale optimization. Hanning Chen et al. [5] designed a hierarchical structure called hierarchical swarm optimization.…”
Section: Introduction
confidence: 99%