2014
DOI: 10.1155/2014/761403
A Novel Adaptive Elite‐Based Particle Swarm Optimization Applied to VAR Optimization in Electric Power Systems

Abstract: Particle swarm optimization (PSO) has been successfully applied to solve many practical engineering problems. However, more efficient strategies are needed to coordinate global and local searches in the solution space when the studied problem is extremely nonlinear and high-dimensional. This work proposes a novel adaptive elite-based PSO approach. The adaptive elite strategies involve the following two tasks: (1) appending the mean search to the original approach and (2) pruning/cloning particles. The mean s…

Cited by 7 publications (2 citation statements)
References 21 publications
“…The vectors $X_p^t$ and $v_p^{t+1}$ denote the position and velocity, respectively, of each particle $p$ in a population residing within a limited search space. The inertia $w$ plays a crucial role in the algorithm, as it determines how the previously updated features are incorporated into the next iteration to influence convergence [30, 31]. The constants $C_1$ and $C_2$ are referred to as acceleration coefficients [32], while $r_1^t$ and $r_2^t$ represent random numbers ranging from 0 to 1.…”
Section: Deep Neural Network With XGBoost
confidence: 99%
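The statement above describes the standard PSO velocity and position update. The following is a minimal Python sketch of that update; the function name, parameter values, bounds, and the sphere objective in the usage line are illustrative assumptions, not taken from the cited works.

import numpy as np

def pso(objective, dim=2, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, lower=-5.0, upper=5.0, seed=0):
    # Initialise positions X_p^t uniformly in the bounded search space and zero velocities v_p^t.
    rng = np.random.default_rng(seed)
    X = rng.uniform(lower, upper, size=(n_particles, dim))
    V = np.zeros((n_particles, dim))
    pbest = X.copy()                                   # personal best positions
    pbest_val = np.array([objective(x) for x in X])    # personal best values
    gbest = pbest[np.argmin(pbest_val)].copy()         # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))            # r_1^t in [0, 1)
        r2 = rng.random((n_particles, dim))            # r_2^t in [0, 1)
        # Velocity update: v_p^{t+1} = w*v_p^t + C1*r1*(pbest - X_p^t) + C2*r2*(gbest - X_p^t)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        # Position update, clipped to stay inside the limited search space.
        X = np.clip(X + V, lower, upper)
        vals = np.array([objective(x) for x in X])
        improved = vals < pbest_val
        pbest[improved] = X[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Usage: minimise the sphere function, an illustrative test problem.
best_x, best_f = pso(lambda x: float(np.sum(x ** 2)))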
“…Particle swarm optimization was developed to search optimally for the local best and the global best; these searches are frequently known as the exploitation and exploration of the problem space, respectively. Hong et al. [21] stated that exploitation involves an intense search by particles in a local region, while exploration is a long-term search whose main objective is to find the global optimum of the fitness function. Although particle swarm optimization rapidly searches for the solution of many complex optimization problems, it suffers from premature convergence, trapping at a local minimum, slowing of convergence near the global optimum, and stagnation in a particular region of the problem space, especially for multimodal functions and high-dimensional problem spaces.…”
Section: Introduction
confidence: 99%
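One widely used way to coordinate the exploration and exploitation mentioned above is a linearly decreasing inertia weight. The sketch below illustrates that general technique under assumed bounds of 0.9 and 0.4; it is not the adaptive elite strategy proposed in the paper.

def linear_inertia(t, iters, w_max=0.9, w_min=0.4):
    # Inertia weight at iteration t of iters, decaying linearly from w_max to w_min.
    return w_max - (w_max - w_min) * t / max(iters - 1, 1)

# Early iterations (w near 0.9) favour long-range exploration of the problem space;
# late iterations (w near 0.4) favour local exploitation around the best solutions found so far.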