2018 IEEE 38th International Conference on Distributed Computing Systems (ICDCS)
DOI: 10.1109/icdcs.2018.00031
PEA: Parallel Evolutionary Algorithm by Separating Convergence and Diversity for Large-Scale Multi-Objective Optimization

Cited by 24 publications (9 citation statements)
References 36 publications
“…There are also some attempts to customize search operators to improve the scalability of certain MOEAs [96,97]. The work in [98] focuses on a specific type of large-scale problems named large-scale sparse MOPs, where most decision variables of the optimal solutions are zero. To deal with this kind of problem and improve the sparsity of the generated solutions, a new population initialization strategy and genetic operators by taking the sparse nature of the Pareto optimal solutions into consideration are proposed.…”
Section: Enhanced Search-based Large-scale MOEAs
confidence: 99%
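
As a reading aid (not part of the cited work), the sparse initialization idea described in the excerpt above can be illustrated with a minimal sketch. The encoding and the activation probability below are hypothetical assumptions, not the strategy proposed in [98]; the only point carried over is that most decision variables of each initial solution are kept at zero.

import numpy as np

def sparse_init(pop_size, n_vars, rng=None):
    """Hypothetical sparse population initialization (illustrative only).

    Each individual is encoded as a dense real vector plus a binary mask;
    the mask is drawn so that most decision variables stay zero, mirroring
    the sparse nature of the Pareto optimal solutions described above.
    """
    rng = np.random.default_rng(rng)
    dec = rng.uniform(0.0, 1.0, size=(pop_size, n_vars))   # candidate values
    # Give each individual a small activation probability so the expected
    # number of non-zero variables per solution stays low.
    p_active = rng.uniform(0.0, 1.0, size=(pop_size, 1)) / n_vars * 10
    mask = rng.uniform(size=(pop_size, n_vars)) < p_active
    return dec * mask                                        # mostly-zero solutions

pop = sparse_init(pop_size=100, n_vars=1000, rng=0)
print("average non-zero variables per solution:", (pop != 0).sum(axis=1).mean())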
“…Then, different optimization strategies are adopted in the two groups to focus on the convergence and diversity, respectively. Recently, several parallelization techniques have been utilized in the grouping of decision variables to speed up the search process [27]-[29].…”
Section: Introduction
confidence: 99%
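
The convergence/diversity separation mentioned in the excerpt above can likewise be sketched. The perturbation test below is a toy stand-in, not the analysis used by PEA or the cited works: it assumes that perturbing a convergence-related variable mainly shifts the aggregate objective value, while perturbing a diversity-related variable mainly trades objectives off against each other.

import numpy as np

def classify_variables(objectives, x, n_samples=10, rng=None):
    """Toy convergence/diversity classification of decision variables.

    Each variable is perturbed several times; variables whose perturbation
    mostly changes the aggregate objective value (a crude distance-to-front
    proxy) are treated as convergence-related, the rest as diversity-related.
    """
    rng = np.random.default_rng(rng)
    convergence, diversity = [], []
    base = np.asarray(objectives(x))
    for i in range(x.size):
        deltas = []
        for _ in range(n_samples):
            y = x.copy()
            y[i] = rng.uniform(0.0, 1.0)
            deltas.append(np.asarray(objectives(y)) - base)
        deltas = np.array(deltas)
        shift = np.abs(deltas.sum(axis=1)).mean()      # moves toward/away from the front
        spread = deltas.std(axis=0).mean()             # moves along the front
        (convergence if shift > spread else diversity).append(i)
    return convergence, diversity

# Toy bi-objective function (hypothetical, for illustration only).
def toy_objectives(x):
    f1 = x[0] + x[1:].sum()          # x[1:] mainly affect convergence
    f2 = 1.0 - x[0] + x[1:].sum()    # x[0] mainly trades f1 against f2
    return np.array([f1, f2])

conv, div = classify_variables(toy_objectives, np.full(5, 0.5), rng=1)
print("convergence-related:", conv, "diversity-related:", div)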
“…In contrast, the diversity-related variables help LSMOEAs find the solution sets with a better distribution. Existing decision-variable grouping strategies can be divided into fixed grouping strategies [13,14,15,16,17,18,19,20] and dynamic grouping strategies [21,22,23,24]. In a fixed grouping strategy, the grouping results do not change during the evolution process, i.e., the evolutionary algorithm for large-scale many-objective optimization, LMEA.…”
Section: Introduction
confidence: 99%
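
For the fixed versus dynamic grouping distinction drawn in the last excerpt, a small hedged sketch may help: a fixed grouping is computed once and reused for the whole run, whereas a dynamic grouping is recomputed as evolution proceeds. The random grouping below is purely illustrative and is not the LMEA procedure.

import numpy as np

def fixed_groups(n_vars, n_groups, rng=None):
    """Fixed grouping: the variable-to-group assignment is computed once and
    reused for the whole run (illustrative random grouping)."""
    rng = np.random.default_rng(rng)
    return np.array_split(rng.permutation(n_vars), n_groups)

def dynamic_groups(n_vars, n_groups, generation):
    """Dynamic grouping: the assignment is recomputed as evolution proceeds,
    here simply by reseeding the shuffle with the generation index."""
    rng = np.random.default_rng(generation)
    return np.array_split(rng.permutation(n_vars), n_groups)

static = fixed_groups(1000, 4, rng=0)            # same groups every generation
print("fixed group sizes:", [len(g) for g in static])
for gen in range(3):
    dyn = dynamic_groups(1000, 4, generation=gen)
    print(f"gen {gen}: first group starts with {dyn[0][:5]}")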