2010
DOI: 10.1007/978-3-642-10701-6_11

Improving Local Convergence in Particle Swarms by Fitness Approximation Using Regression

Abstract: In this chapter we present a technique that helps Particle Swarm Optimisers (PSOs) locate an optimum more quickly, through fitness approximation using regression. A least-squares regression is used to estimate the shape of the local fitness landscape. From this shape, the expected location of the peak is calculated and the information given to the PSO. By guiding the PSO to the optimum, the local convergence speed can be vastly improved. We demonstrate the effectiveness of using regression on several…
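As a rough illustration of the idea in the abstract, the sketch below fits a separable quadratic surface to sampled (position, fitness) pairs by ordinary least squares and returns the vertex of the fitted surface as the estimated optimum, which a PSO could then evaluate and adopt as a guide point. The function name estimate_peak, the separable-quadratic model, the minimisation assumption, and the test function are illustrative choices, not details taken from the chapter.

import numpy as np

def estimate_peak(positions, fitnesses):
    """Least-squares fit of f(x) ~ sum_d (a_d x_d^2 + b_d x_d) + c;
    returns the per-dimension vertex of the fitted surface."""
    positions = np.asarray(positions, dtype=float)   # shape (n_samples, n_dims)
    fitnesses = np.asarray(fitnesses, dtype=float)   # shape (n_samples,)
    n, dims = positions.shape
    # Design matrix columns: x_1^2 .. x_D^2, x_1 .. x_D, 1
    design = np.hstack([positions ** 2, positions, np.ones((n, 1))])
    coeffs, *_ = np.linalg.lstsq(design, fitnesses, rcond=None)
    a, b = coeffs[:dims], coeffs[dims:2 * dims]
    if np.any(a <= 0):
        return None  # fit is not locally convex, so the vertex is not a minimum
    return -b / (2.0 * a)  # vertex of a_d x^2 + b_d x in each dimension

# Illustrative check on a quadratic bowl centred at 1.5 in each dimension.
rng = np.random.default_rng(0)
samples = rng.uniform(-5.0, 5.0, size=(20, 3))
values = [float(np.sum((s - 1.5) ** 2)) for s in samples]
print(estimate_peak(samples, values))  # approximately [1.5, 1.5, 1.5]

In a PSO loop, the returned point could simply be evaluated and allowed to replace the global best whenever it improves on it, which is one plausible way to hand the regression estimate to the swarm.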

Cited by 11 publications (9 citation statements) | References 33 publications

“…This defines the margin of separation, the width of which is given by . In order to simplify the optimization problem without changing the solution, SVM will be used to minimize (7) subject to (8) In Eq. (7), the parameter " " is introduced for mathematical convenience only.…”
Section: B. Support Vector Machine
mentioning, confidence: 99%
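The inline symbols in the excerpt above were lost in extraction, and its Eqs. (7) and (8) are not reproduced here. For context, the textbook hard-margin SVM formulation that the excerpt appears to paraphrase is given below; the margin of separation has width 2/\lVert w \rVert, and the factor 1/2 is the parameter introduced purely for mathematical convenience. This is the standard form, not text recovered from the citing paper.

% Standard hard-margin SVM primal (textbook form)
\begin{align*}
  \min_{w,\,b}\ \Phi(w) &= \tfrac{1}{2}\, w^{\top} w, \\
  \text{subject to}\quad d_i \left( w^{\top} x_i + b \right) &\ge 1, \qquad i = 1, \dots, N .
\end{align*}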
“…Margarita and Coello [6] proposed to incorporate the concept of fitness inheritance into a multi-objective particle swarm optimization in order to reduce the fitness evaluations. Bird and Li [7] proposed a fitness approximation technique using a least-square regression to find the expected location of the peak and improve the local convergence speed. Parno et al [8] incorporated surrogates constructed using Design and Analysis of Computer Experiments (DACE) as a stand-in for the expensive objective function within the PSO framework.…”
Section: Introduction
mentioning, confidence: 99%
“…Attempts have also been made to hybridize niching methods with local search procedures, in order to enhance convergence to multiple optima, or in other words, distributed convergence. For example, regression was incorporated into Speciation-based PSO (i.e., rSPSO) for improving local convergence on both static and dynamic multi-modal landscapes in [69]. The faster and more accurate local convergence is achieved by using regression computed based on only a handful of existing individuals in the population.…”
Section: Hybrid Methods
mentioning, confidence: 99%
“…A simple regression method with Speciation-based PSO (so called rSPSO) [69], [120] shows significantly better performance than several other multi-population methods such as mQSO [42]. The regression method can be substituted by other surrogate models such as Kriging [121].…”
Section: B. Dynamic Optimization
mentioning, confidence: 99%
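The substitution mentioned in the excerpt above, replacing the regression with a Kriging surrogate, could look roughly like the sketch below, which fits a Gaussian-process model to the sampled individuals and returns the candidate point with the lowest predicted fitness. The function name kriging_peak, the use of scikit-learn's GaussianProcessRegressor, and the random candidate search are illustrative assumptions, not details from the cited works; minimisation is assumed.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def kriging_peak(positions, fitnesses, n_candidates=2000, rng=None):
    """Fit a Kriging (Gaussian-process) surrogate to sampled individuals and
    return the candidate point with the lowest predicted fitness."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.asarray(positions, dtype=float)
    y = np.asarray(fitnesses, dtype=float)
    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    surrogate.fit(X, y)
    # Cheap search of the surrogate: sample candidates inside the data's bounding box.
    lower, upper = X.min(axis=0), X.max(axis=0)
    candidates = rng.uniform(lower, upper, size=(n_candidates, X.shape[1]))
    return candidates[int(np.argmin(surrogate.predict(candidates)))]

The returned point plays the same role as the regression estimate: it is a cheap suggestion that the swarm can evaluate with the true objective before accepting.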
“…Parno et al [26] used a Kriging surrogate to improve the efficiency of PSO for simulation-based problems and applied it to a 6-D groundwater management problem. Bird and Li [27] incorporated a regression model into the PSO algorithm in order to improve local convergence. Tang et al [28] used a hybrid global surrogate model consisting of a quadratic polynomial and an RBF model to develop a surrogate-based PSO method, which was applied to low-dimensional test problems and engineering design problems.…”
Section: Introduction
mentioning, confidence: 99%