2014
DOI: 10.1007/978-3-319-10762-2_59
Distance-Based Analysis of Crossover Operators for Many-Objective Knapsack Problems

Cited by 25 publications (12 citation statements)
References 15 publications
“…Therefore, two nearly converged solutions may generate offspring far from the true Pareto front. This issue has also been confirmed in [6][7]. Furthermore, in most cases, when MOEAs are applied to solve many-objective optimization problems, they can explore only a small region of the large search space and focus on a limited area of the Pareto front because of the ineffective evolutionary operator.…”
Section: Introduction
confidence: 76%
“…Based on the analysis, the authors further suggested Controlling the maximum number of Crossed Genes (C-CG) in crossover operators, which could significantly improve the search performance of several MOEAs on MaOPs. Ishibuchi et al. [17] observed that a clear performance improvement on many-objective knapsack problems is obtained when the parent-offspring distance is small. To further verify this observation, the authors implemented a distance-based crossover operator where the parent-offspring distance is specified as a user-defined parameter.…”
Section: Variation Operators in Multi- and Many-Objective Optimization
confidence: 98%
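The excerpt above describes a crossover whose parent-offspring distance is bounded by a user-defined parameter. A minimal sketch of that idea for a binary knapsack encoding is shown below; it is an illustrative reconstruction, not the authors' actual implementation, and the function and parameter names (distance_capped_crossover, max_distance) are assumptions made here for clarity.

```python
import random


def distance_capped_crossover(parent_a, parent_b, max_distance):
    """Crossover over binary strings where the offspring's Hamming
    distance to parent_a never exceeds max_distance.

    parent_a, parent_b: lists of 0/1 genes of equal length.
    max_distance: illustrative stand-in for the user-defined
    parent-offspring distance parameter discussed in the excerpt.
    """
    assert len(parent_a) == len(parent_b)
    offspring = list(parent_a)  # start from the primary parent

    # Only positions where the parents disagree can increase the
    # offspring's distance to parent_a.
    differing = [i for i in range(len(parent_a)) if parent_a[i] != parent_b[i]]
    random.shuffle(differing)

    # Copy at most max_distance genes from parent_b, so the resulting
    # parent-offspring Hamming distance stays within the cap.
    for i in differing[:max_distance]:
        offspring[i] = parent_b[i]
    return offspring


if __name__ == "__main__":
    random.seed(0)
    a = [random.randint(0, 1) for _ in range(20)]
    b = [random.randint(0, 1) for _ in range(20)]
    child = distance_capped_crossover(a, b, max_distance=3)
    print(sum(x != y for x, y in zip(a, child)))  # always <= 3
```

Under this sketch, a small cap keeps offspring close to an already converged parent, which is the effect the cited analysis associates with improved search performance on many-objective knapsack problems.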
“…However, almost all the related algorithms in this regard adopted the conventional genetic operators to reproduce solutions, leaving room for the investigation of other alternative variation operators, e.g., differential evolution (DE) [26]. On the other hand, several recent studies [24,14,17] have shown that the issue of performing an effective search in the decision space should be addressed more carefully when handling many objectives. Nevertheless, it is well known that pure genetic operators may not be powerful enough to achieve a good balance between exploration and exploitation in the decision space for some complex problems [30].…”
Section: Introduction
confidence: 99%
“…Second, the extensive search in a high-dimensional space would seriously undermine the efficiency of algorithmic operators, such as mating selection and variation [15]. As confirmed in [16,17], during variation, the offspring produced by two nearly converged solutions, which is expected to keep approaching the PF along its original direction, may instead move far away from the true PF. This can cause the final population to fail to converge to the PF, despite spreading over the objective space.…”
Section: Introduction
confidence: 99%