2017
DOI: 10.1109/tcyb.2015.2507366
Biased Multiobjective Optimization and Decomposition Algorithm

Abstract: The bias feature is a major factor that makes a multiobjective optimization problem (MOP) difficult for multiobjective evolutionary algorithms (MOEAs). To deal with this problem feature, an algorithm must carefully balance exploration and exploitation. A decomposition-based MOEA decomposes an MOP into a number of single-objective subproblems and solves them in a collaborative manner; single-objective optimizers can easily be used within this framework. Covariance matrix adaptation ev…
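The decomposition idea summarized in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's algorithm: the helper names (`weight_vectors`, `weighted_sum`) and the simple weighted-sum aggregation for a two-objective case are assumptions.

```python
import numpy as np

def weight_vectors(n, m=2):
    """Evenly spread weight vectors for m=2 objectives (simplex lattice)."""
    w1 = np.linspace(0.0, 1.0, n)
    return np.stack([w1, 1.0 - w1], axis=1)

def weighted_sum(f, w):
    """One scalar subproblem: minimize the weighted sum of the objectives."""
    return float(np.dot(w, f))

# Decompose a 2-objective MOP into 5 single-objective subproblems,
# then score one candidate solution under each subproblem.
W = weight_vectors(5)
f = np.array([1.0, 3.0])  # objective values of one candidate solution
scores = [weighted_sum(f, w) for w in W]
```

In a decomposition-based MOEA each subproblem keeps its own best solution, and neighboring subproblems (those with similar weight vectors) exchange offspring, which is the collaborative aspect the abstract refers to.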

Cited by 159 publications (52 citation statements). References 35 publications.
“…The test problems designed for IM-MOEA BT1-BT9 [83] 2016 Multi-objective test problems with bias LSMOP1-LSMOP9 [84] 2016 Large-scale multi-objective test problems II. In addition, there are a lot of performance indicators provided by PlatEMO for experimental studies,…”
Section: Problemmentioning
confidence: 99%
“…For MOEA/D and MOEA/DD, the size of neighborhood T is set to ⌈0.1 × N ⌉ (with N denoting the population size), and the neighborhood selection probability δ is set to 0.9. In addition, for MOEA/D, the maximum number of solutions replaced by each offspring n r is set to ⌈0.01 × N ⌉, and the Tchebycheff approach (with transformed reference points [57]) is employed as the aggregation function. For RVEA and RVEA*, the penalty parameter α is set to 2, and the frequency of reference point adaption f r is set to 0.1.…”
Section: ) Parameters In the Compared Moeasmentioning
confidence: 99%
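The quoted parameter settings mention the Tchebycheff approach as MOEA/D's aggregation function. A minimal sketch of the standard Tchebycheff scalarization, assuming minimization and a known ideal point z*; the function name and example values are illustrative, and the transformed-reference-point variant from [57] is not reproduced here.

```python
import numpy as np

def tchebycheff(f, weights, z_star):
    """Tchebycheff aggregation: g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|.

    f       : objective values of a solution
    weights : weight vector of one subproblem
    z_star  : ideal (reference) point, the best value seen per objective
    """
    return float(np.max(weights * np.abs(f - z_star)))

# Two objectives, ideal point at the origin (illustrative values).
f = np.array([0.5, 2.0])
w = np.array([0.8, 0.2])
z = np.zeros(2)
print(tchebycheff(f, w, z))  # max(0.8*0.5, 0.2*2.0) = 0.4
```

Minimizing this scalar value for a well-spread set of weight vectors drives the population toward different regions of the Pareto front, one subproblem per region.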
“…[11], [14], [20], [35] Multi-modality with brittle global optima and robust local optima It is possible to formulate a problem where each objective presents a global optimum and several closer sub-optimal solutions. However, in only one of the sub-optimal solutions, robustness is observed.…”
Section: Modalitymentioning
confidence: 99%