2014 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec.2014.6900540

MOPSOhv: A new hypervolume-based multi-objective particle swarm optimizer

Cited by 33 publications (16 citation statements) | References 21 publications
“…The third sub-section compares our proposed HGLSS with respect to other leader selection strategies under the framework of the MOPSO algorithm. The last sub-section compares the performance of MOPSO-HGLSS with respect to nine popular population-based metaheuristics (SMPSO [25], dMOPSO [29], MOPSOhv [30], MaPSO [31], MOEA/D [32], NSGA-III [33], DBEA [34], RVEA [35] and ARMOEA [36]), in terms of IGD+ on different scalable MOPs (using 3, 5, 8 and 10 objectives). Section V presents our conclusions and some possible paths for future research.…”
Section: Some MOEAs Have Recently Been Proposed To Handle
confidence: 99%
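The comparisons in the excerpt above are reported in terms of the IGD+ indicator. As a point of reference, here is a minimal sketch of how IGD+ is typically computed for a minimization problem (the average modified distance from a reference front to the approximation set, counting only objective-wise deteriorations); the reference front and approximation in the usage example are illustrative placeholders, not data from the cited study.

```python
# Minimal sketch of the IGD+ indicator for minimization problems
# (illustrative only; not code from the cited papers).
import math

def igd_plus(ref_front, approx_set):
    """Average, over the reference front, of the modified distance to the
    nearest approximation point; only objective-wise deteriorations count."""
    def d_plus(z, a):
        return math.sqrt(sum(max(ai - zi, 0.0) ** 2 for ai, zi in zip(a, z)))
    return sum(min(d_plus(z, a) for a in approx_set) for z in ref_front) / len(ref_front)

# Illustrative usage with placeholder points (smaller IGD+ is better).
front = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
approx = [(0.1, 1.0), (0.6, 0.6), (1.0, 0.1)]
print(igd_plus(front, approx))
```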
“…At present, many MOEAs adopt a global-optimum selection strategy based on nondominated sorting [10, 11], Pareto dominance [32], hypervolume [33, 34], niching [35, 36], and so on, but all of these suffer from either excessively high or excessively low selection pressure.…”
Section: Related Work and Motivation
confidence: 99%
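The hypervolume-based selection strategies mentioned above reduce to ranking solutions by their exclusive hypervolume contribution. The sketch below shows that computation for the two-objective minimization case; the sweep-based helper, the example archive, and the reference point are assumptions for illustration, not code from MOPSOhv or the citing paper.

```python
# Minimal sketch: exclusive hypervolume contributions of a 2-objective,
# mutually non-dominated archive (minimization). Illustrative only.

def hv_2d(points, ref):
    """Hypervolume of a non-dominated 2-D point set w.r.t. reference point `ref`."""
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(points):          # ascending f1 implies descending f2
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def hv_contributions(points, ref):
    """Exclusive contribution of each point: HV(all) - HV(all without it)."""
    total = hv_2d(points, ref)
    return [total - hv_2d(points[:i] + points[i + 1:], ref)
            for i in range(len(points))]

# Illustrative usage with a placeholder archive and reference point.
archive = [(1.0, 4.0), (2.0, 2.5), (3.5, 1.0)]
print(hv_contributions(archive, ref=(5.0, 5.0)))
```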
“…2) The MOPSOhv algorithm: MOPSOhv is a hypervolume-based multi-objective particle swarm algorithm that has proven effective on many-objective optimization problems [14]. From an operational point of view, MOPSOhv randomly initializes the swarm with N particles.…”
Section: B. Multi- and Many-Objective Optimization
confidence: 99%
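Beyond the random initialization noted in the excerpt above, a hypervolume-based MOPSO of this kind typically couples a standard PSO velocity/position update with a global leader drawn from an external archive, biased towards members with large hypervolume contributions. The sketch below illustrates that idea under those assumptions; the function names, parameter defaults, and leader-sampling rule are illustrative, not the published MOPSOhv implementation.

```python
# Minimal sketch of a hypervolume-guided PSO step (assumptions only; not the
# published MOPSOhv code). Leaders are sampled from an external archive with
# probability proportional to their hypervolume contribution.
import random

def init_swarm(n_particles, n_vars, bounds=(0.0, 1.0)):
    """Random positions and zero velocities within the given box bounds."""
    lo, hi = bounds
    positions = [[random.uniform(lo, hi) for _ in range(n_vars)]
                 for _ in range(n_particles)]
    velocities = [[0.0] * n_vars for _ in range(n_particles)]
    return positions, velocities

def pso_step(positions, velocities, pbest, archive, contribs,
             w=0.4, c1=1.5, c2=1.5, bounds=(0.0, 1.0)):
    """One velocity/position update; `contribs` holds the archive members'
    hypervolume contributions, used here as leader-selection weights."""
    lo, hi = bounds
    for i, x in enumerate(positions):
        leader = random.choices(archive, weights=contribs, k=1)[0]
        for d in range(len(x)):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - x[d])
                                + c2 * r2 * (leader[d] - x[d]))
            x[d] = min(max(x[d] + velocities[i][d], lo), hi)
```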
“…The optimization problem is solved in two steps. First, a hypervolume-based multi-objective particle swarm algorithm (i.e., MOPSOhv [14]) generates a set of Pareto-optimal recruitment plans. Then, the best recruitment plan is selected by means of the TOPSIS algorithm.…”
Section: Introduction
confidence: 99%
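The second step of that pipeline ranks the Pareto-optimal plans with TOPSIS. The following is a generic TOPSIS sketch (vector normalization, weighted distances to the ideal and anti-ideal points, closeness coefficient); the equal-weight default and the minimize-all-criteria assumption are placeholders, not the weighting used in the cited work.

```python
# Generic TOPSIS sketch (illustrative; not the cited paper's implementation).
import math

def topsis(matrix, weights=None, minimize=None):
    """Return a closeness-to-ideal score per alternative (higher is better).
    `matrix` is alternatives x criteria; `minimize[j]` marks cost criteria."""
    m, n = len(matrix), len(matrix[0])
    weights = weights or [1.0 / n] * n
    minimize = minimize if minimize is not None else [True] * n
    # Vector-normalize each criterion, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) or 1.0 for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    # Ideal and anti-ideal points, respecting each criterion's direction.
    ideal = [min(col) if minimize[j] else max(col) for j, col in enumerate(zip(*v))]
    worst = [max(col) if minimize[j] else min(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best, d_worst = math.dist(row, ideal), math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst) if d_best + d_worst else 0.0)
    return scores

# Illustrative usage: pick the plan with the highest closeness score.
plans = [[0.2, 0.9], [0.5, 0.4], [0.8, 0.1]]   # placeholder objective values
scores = topsis(plans)
best = max(range(len(plans)), key=lambda i: scores[i])
```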