2011
DOI: 10.1007/978-3-642-21524-7_6
A Modified Multi-objective Binary Particle Swarm Optimization Algorithm

Cited by 7 publications (11 citation statements), published 2013–2023.
References 12 publications.
“…The original particle swarm optimization (PSO) algorithm is best suited for an optimization of continuous problems, but several modifications [36,37] exist, which enable it to be used for discrete problems. In the case of multiple objectives which contradict each other, the PSO algorithm may be adapted to find a Pareto optimal set of solutions [38,39]. We used the Multi-objective particle swarm optimization (MOPSO) method proposed by Coello et.…”
Section: Imopso For Finding a Pareto Set Of Possible Service Distribu...mentioning
confidence: 99%
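The discrete modifications the excerpt alludes to most commonly follow the sigmoid-based binary PSO of Kennedy and Eberhart: velocities stay continuous, but each component is squashed into a bit-flip probability. A minimal illustrative sketch (not code from the cited paper):

```python
import math
import random

def sigmoid(v):
    """Map a velocity component to a bit probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

def binary_pso_position_update(velocity):
    """Sample a binary position from a continuous velocity vector.

    Each velocity component is squashed through a sigmoid and used as
    the probability that the corresponding bit becomes 1, which is how
    the classic binary PSO adapts the continuous algorithm to
    discrete search spaces.
    """
    return [1 if random.random() < sigmoid(v) else 0 for v in velocity]

# Strongly positive velocities bias bits toward 1, strongly
# negative ones toward 0.
bits = binary_pso_position_update([6.0, -6.0, 0.0])
```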
“…Multi-objective Problems (MOPs) are known to have many contradictory objectives where enhancing the result of one objective will have a negative impact on the other objectives involved. MOPSO attempts to effectively find a solution or a set of solutions that ensure a balance between all the involved objectives as is thoroughly discussed in [28][29][30][31][32][33][34][35][36]. The main differences between the SOPSO and the MOPSO algorithms are:…”
Section: Particle Swarm Optimization (Pso)mentioning
confidence: 99%
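The "balance between all the involved objectives" that MOPSO targets rests on Pareto dominance, which can be stated compactly (a generic sketch for minimization, not the paper's code):

```python
def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if a is no
    worse in every objective and strictly better in at least one.
    MOPSO variants use such a test to keep a set of non-dominated
    leaders rather than a single global best."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# (1, 2) dominates (2, 2); (1, 3) and (2, 2) are a trade-off,
# so neither dominates the other.
assert dominates((1.0, 2.0), (2.0, 2.0))
assert not dominates((1.0, 3.0), (2.0, 2.0))
```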
“…To avoid filling up the leaders archive, a crowding distance based on the non-dominated sorting genetic algorithm-II (NSGA-II) is used to decide which particles must remain in the archive [38,39]; and 4. A mutation operator is applied to a portion of the swarm to improve the exploration and search ability and to avoid premature convergence [34,35,37]. Using the mutation method allowed us to give up using a simulated annealing method used to enhance the SOPSO performance by dynamically varying the inertia weight value.…”
Section: Particle Swarm Optimization (Pso)mentioning
confidence: 99%
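The NSGA-II crowding distance mentioned above can be sketched as follows: boundary solutions of a front get infinite distance, interior ones the sum of normalized neighbour gaps per objective, and archive truncation then drops the most crowded (smallest-distance) members first. An illustrative sketch, not the paper's exact procedure:

```python
def crowding_distance(front):
    """NSGA-II crowding distance over a list of objective vectors."""
    n = len(front)
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        # Sort indices by the k-th objective.
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        # Boundary solutions are always kept.
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for j in range(1, n - 1):
            gap = front[order[j + 1]][k] - front[order[j - 1]][k]
            dist[order[j]] += gap / (hi - lo)
    return dist
```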
“…Our second contribution will be to add to the evidence [3] [7] that PSOs operating in a discrete search space may find good solutions more easily without a traditional velocity update rule, particularly for multi-objective problems, as evidenced in [8]. Velocity-free algorithms appear superior in this sense.…”
Section: Introductionmentioning
confidence: 99%
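One way a velocity-free discrete PSO can move (purely illustrative; not the specific scheme of [8], and `p_follow` is an assumed parameter) is to copy each bit from the personal or global best with some probability and otherwise keep it, so particles drift toward good solutions without maintaining a velocity vector at all:

```python
import random

def velocity_free_update(x, p_best, g_best, p_follow=0.4):
    """Move a binary particle without a velocity vector.

    Each bit is taken from the personal best with probability
    p_follow, from the global best with probability p_follow,
    and otherwise left unchanged (illustrative scheme only).
    """
    new_x = []
    for xi, pi, gi in zip(x, p_best, g_best):
        r = random.random()
        if r < p_follow:
            new_x.append(pi)
        elif r < 2 * p_follow:
            new_x.append(gi)
        else:
            new_x.append(xi)
    return new_x
```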
“…in which p_t^i is the best parameter position found at time t by particle i, g_t the best position found by any particle, c_1 and c_2 are positive constants most often set to 2.0 [8], r_1 and r_2 are random numbers ∈ [0, 1], and w is a usually iteration-decreasing "inertia weight" that controls the extent to which new search directions are pursued. It is also usually recommended to clip the velocity to a maximum magnitude V_max so that the search remains within useful bounds.…
Section: Introductionmentioning
confidence: 99%
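The update rule this excerpt describes is the standard PSO velocity update, v_new = w·v + c_1·r_1·(p_best − x) + c_2·r_2·(g_best − x), followed by clipping to [−V_max, V_max]. A generic sketch with illustrative parameter values (the defaults for w and v_max here are assumptions, not values from the paper):

```python
import random

def pso_velocity_update(v, x, p_best, g_best,
                        w=0.7, c1=2.0, c2=2.0, v_max=4.0):
    """Standard PSO velocity update with clipping.

    v_new = w*v + c1*r1*(p_best - x) + c2*r2*(g_best - x),
    clipped componentwise to [-v_max, v_max] so the search
    stays within useful bounds, as the excerpt recommends.
    """
    new_v = []
    for vi, xi, pi, gi in zip(v, x, p_best, g_best):
        r1, r2 = random.random(), random.random()
        vn = w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
        new_v.append(max(-v_max, min(v_max, vn)))
    return new_v
```

Note that when a particle sits exactly on both its personal and the global best, only the inertia term survives, which is why a decreasing w gradually damps the search.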