2021
DOI: 10.1109/access.2021.3086559
A Competitive Particle Swarm Algorithm Based on Vector Angles for Multi-Objective Optimization

Abstract: Recently, particle swarm optimization (PSO) has demonstrated its effectiveness in solving multi-objective optimization problems. However, the performance of most existing multi-objective particle swarm algorithms depends largely on the global or personal best particles. Moreover, because PSO converges rapidly on single-objective optimization problems, it is prone to poor distribution indicators when dealing with multi-objective optimization problems. To solve the above problems, we propose a multi-o…

Cited by 11 publications (4 citation statements)
References 69 publications
“…The PSO algorithm draws inspiration from the behavior of a flock of birds searching for food. Imagine a scenario where there is only one food source in the area, and the objective of the flock is to locate this food source [11] . By collaborating and sharing information, the flock can determine whether they have found the optimal solution or not.…”
Section: Ant Colony Algorithm
confidence: 99%
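The flock metaphor in the statement above maps directly onto the PSO update rule: each particle is attracted toward its own best-known position and the swarm's best-known position. The following is a minimal single-objective sketch (not the cited paper's competitive, vector-angle-based variant); the function name, parameter values, and test objective (the sphere function) are illustrative assumptions.

```python
import random

def pso_sphere(dim=2, n_particles=20, iters=200, seed=0):
    """Minimal single-objective PSO minimizing the sphere function.

    Hypothetical parameters for illustration; not the paper's algorithm.
    """
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)  # sphere objective, minimum at 0

    # Random initial positions, zero initial velocities
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's personal best
    gbest = min(pbest, key=f)[:]             # swarm's shared global best

    w, c1, c2 = 0.7, 1.5, 1.5  # inertia weight and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity pulled toward personal best and global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest, f(gbest)
```

The strong pull toward `gbest` is exactly what makes single-objective PSO converge quickly, and, as the abstract notes, what can hurt solution distribution in the multi-objective setting.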
“…The general form of a multi-objective optimization problem is as follows [13]:

$$
\left\{
\begin{array}{l}
\min F(x) = \left[f_1(x), f_2(x), \cdots, f_M(x)\right] \\
H(x) = \left[h_1(x), h_2(x), \cdots, h_J(x)\right] = 0 \\
G(x) = \left[g_1(x), g_2(x), \cdots, g_K(x)\right] \le 0
\end{array}
\right.
$$

where $F(x)$ is the objective function, the functions $f_1, f_2, \ldots, f_M$ map the D-dimensional search space to the M-dimensional objective space, and $x$ is the position vector in the D-dimensional search space.…”
Section: Multi-Objectives Protection Settings Optimization Model
confidence: 99%
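In the formulation above, the $M$ objectives generally conflict, so solutions are compared by Pareto dominance rather than a single scalar value. A minimal sketch of that comparison for minimization (the function names are illustrative assumptions, not from the cited paper):

```python
def dominates(f_a, f_b):
    """Pareto dominance for minimization: f_a dominates f_b if it is
    no worse in every objective and strictly better in at least one."""
    return (all(a <= b for a, b in zip(f_a, f_b))
            and any(a < b for a, b in zip(f_a, f_b)))

def non_dominated(objective_vectors):
    """Filter a list of objective vectors down to the non-dominated set."""
    return [f for f in objective_vectors
            if not any(dominates(g, f) for g in objective_vectors if g is not f)]
```

Multi-objective PSO variants such as the one in the cited article use this relation (combined with a diversity measure, here vector angles) to select leaders and maintain the archive of non-dominated solutions.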
“…Offline and online identification represent the primary categories of parameter identification [3]. Offline methods, such as genetic algorithms [4,5], firefly algorithms [6], and particle swarm optimization algorithms [7,8], necessitate extensive data collection and storage. However, offline techniques may not accurately capture the system's actual physical models under varying work conditions.…”
Section: Introduction
confidence: 99%