2011
DOI: 10.4028/www.scientific.net/amr.186.454

An Improved Particle Swarm Optimization Algorithm

Abstract: An improved particle swarm optimization (IPSO) algorithm is proposed to address the problem that the linearly decreasing inertia weight (LDIW) of the standard particle swarm optimization algorithm cannot adapt to complex, nonlinear optimization processes. The algorithm adopts a nonlinear decreasing inertia weight strategy based on a concave function and introduces an aggregation degree factor for the swarm. In each iteration, the weight is adjusted dynamically based on…
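The truncated abstract describes a PSO variant whose inertia weight decreases nonlinearly along a concave profile instead of linearly. The sketch below illustrates the idea with a standard PSO loop; since the paper's exact concave function and aggregation-degree update are not given here, a quadratic decrease is assumed for illustration, and the function, bounds, and coefficient values (`c1`, `c2`, `w_max`, `w_min`) are illustrative defaults, not the authors' settings.

```python
import random

def nonlinear_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Concave nonlinear decrease from w_max to w_min.
    The paper's concave function is not shown in the truncated
    abstract; a quadratic profile is assumed here as an example."""
    return w_max - (w_max - w_min) * (t / t_max) ** 2

def pso(f, dim=2, n=30, t_max=200, bounds=(-5.0, 5.0),
        c1=2.0, c2=2.0, seed=0):
    """Minimize f with PSO using the nonlinear inertia schedule."""
    rng = random.Random(seed)
    lo, hi = bounds
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in x]                # personal best positions
    pbest_f = [f(p) for p in x]              # personal best values
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best
    for t in range(t_max):
        w = nonlinear_inertia(t, t_max)       # weight shrinks each iteration
        for i in range(n):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)  # clamp to bounds
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = x[i][:], fx
    return gbest, gbest_f

sphere = lambda p: sum(c * c for c in p)      # simple convex test function
best, val = pso(sphere)
```

A large early weight favors global exploration, while the concave decrease keeps the weight high longer than a linear schedule before dropping it for fine local search near the end of the run.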

Cited by 11 publications (10 citation statements)
References 2 publications
“…Furthermore, the PSO is simple and easy to implement, which does not have many parameters to adjust and does not need gradient information. PSO is also shown to be better in dealing with nonlinear continuous optimization problems, combinatorial optimization problems, and mixed-integer nonlinear optimization problems [56]. In this study, for the parameter identification of PD model, it is impossible to determine the specific range of initial parameters for identification because the parameters of these virtual patients are different.…”
Section: Discussion
confidence: 99%
“…This section outlines the 23 classical test functions that were employed for experiments. The performance of DESMAOA was compared with two newly proposed algorithms (SMA and AOA) and another five very famous optimization algorithms (GWO [38], WOA [39], SSA [40], MVO [5], and PSO [12]). Table 4 lists the main parameter values used in each algorithm.…”
Section: The Classical Benchmark Functions
confidence: 99%
“…Swarm-inspired: particle swarm optimization (PSO) [12], emperor penguin optimizer (EPO) [13], Aquila optimizer (AO) [14], remora optimization algorithm (ROA) [4], marine predators algorithm (MPA) [15].…”
confidence: 99%
“…We have collected eight ensembles near the phase transition region with 2 + 1 flavors of fermions. All the simulations have a 16³ × 8 space-time volume and a fifth dimension of L_s = 32 or 48, and they all lie on a line of constant physics with m_π ≈ 200 MeV and an almost physical kaon mass [7]. Table 1 gives the basic parameters of these finite-temperature ensembles.…”
Section: Implementation Details
confidence: 99%