2021
DOI: 10.1109/access.2020.3047699

Decomposition-Based Multiobjective Evolutionary Algorithm With Genetically Hybrid Differential Evolution Strategy

Abstract: In decomposition-based multiobjective evolutionary algorithms (MOEA/Ds), a set of subproblems is optimized by evolutionary search to exploit the feasible regions. Recent studies of MOEA/Ds found that the design of recombination operators significantly affects their performance. This paper therefore proposes a novel genetically hybrid differential evolution strategy (GHDE) for recombination in MOEA/Ds, which effectively strengthens the search capability. Inspired by the …
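As background for the abstract above, the classic DE "rand/1/bin" recombination that MOEA/D-style algorithms apply per subproblem can be sketched as follows. This is a minimal illustration of standard differential evolution within a mating neighborhood, not the paper's GHDE strategy; the parameter names (`F`, `CR`) are the usual DE scale factor and crossover rate, and the simple neighborhood sampling is an assumption standing in for MOEA/D's mating-pool mechanism.

```python
import random

def de_rand_1_bin(pop, idx, neighbors, F=0.5, CR=0.9, lo=0.0, hi=1.0):
    """Create one offspring for subproblem `idx` from its mating neighborhood."""
    # Pick three distinct neighbors other than the current solution.
    r1, r2, r3 = random.sample([i for i in neighbors if i != idx], 3)
    x, a, b, c = pop[idx], pop[r1], pop[r2], pop[r3]
    n = len(x)
    j_rand = random.randrange(n)  # guarantees at least one mutated gene
    child = []
    for j in range(n):
        if random.random() < CR or j == j_rand:
            v = a[j] + F * (b[j] - c[j])   # differential mutation
            v = min(max(v, lo), hi)        # clip to box constraints
        else:
            v = x[j]                       # inherit from the current parent
        child.append(v)
    return child

random.seed(0)
pop = [[random.random() for _ in range(5)] for _ in range(10)]
child = de_rand_1_bin(pop, idx=0, neighbors=list(range(10)))
print(child)
```

In a full MOEA/D loop the offspring would then be evaluated and used to update neighboring subproblems under a scalarizing function such as Tchebycheff decomposition.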

Cited by 5 publications (2 citation statements)
References 52 publications
“…
- Adaptive operator selection: MOEA/D-FRRMAB [117], MOEA/D-UCB-Tuned and MOEA/D-UCB-V [55], NSGA-III AP and NSGA-III PM [58], SaMOEA/D [184], MOEA/D-DYTS [198]
- Parameter control: MPADE [25], MOEA/D-CDE [126], AMOEA/D [127]
- Ensemble of operators: MOEA/D-GHDE [140], EF-PD [218], MOEA/D-IMA [231]
- Hyper-heuristics: MAB-HH [45], MOEA/D-HH {SW} [56], MOEA/D-HH [57], MOEA/D-DRA-UCB [181], BoostMOEA/D-DRA [182], MOEA/D-LinUCB [54], MAB-based HH [27]
- Mathematical programming: direct search method (MOEA/D-LS [147], MOEA/D-RBF [149], MONSS [150], NSS-MO [152], MOEA/D-QS [180], MOEA/D-NMS [245]); KKTPM (NSGA-III-KKTPM [2,3,191])
- Memetic search: Pareto local search (MOMAD [96], PPLS/D [193]); neighborhood search (CoMOLS/D [15], MONSD [73], MOMNS/D and MOMNS/V [211], LS/D [258], MOEA/D-TS [5], MOEA-MA [38], DMTS [259], MOEA/D-GLS [4]); simulated annealing (EMOSA [109,110], MOMA-SA [14], EOMO [90]); MOEA/D-GRASP [6], HEMH [94]…”
Section: AOS and HH
confidence: 99%
“…In addition to AOS, [142][143][144] also consider adaptively controlling the parameters associated with the reproduction operator in order to achieve the best algorithm setup. Rather than adaptively selecting the "appropriate" reproduction operator on the fly, another line of research (e.g., [145][146][147]) builds an ensemble of reproduction operators and uses them simultaneously.…”
Section: Adaptive Operator Selection and Hyper-heuristics
confidence: 99%
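The adaptive operator selection idea contrasted with operator ensembles in the citation above can be sketched with a simple probability-matching scheme: each operator keeps a quality estimate updated from observed rewards, and selection probabilities follow those estimates with an exploration floor. This is a generic illustration in the spirit of bandit-based AOS (e.g., FRRMAB), not the method of any cited paper; the class name, `p_min`, and `alpha` are illustrative choices.

```python
import random

class ProbabilityMatchingAOS:
    """Toy probability-matching adaptive operator selection."""

    def __init__(self, n_ops, p_min=0.1, alpha=0.3):
        self.n = n_ops
        self.p_min = p_min                  # exploration floor per operator
        self.alpha = alpha                  # learning rate for quality updates
        self.quality = [1.0] * n_ops        # optimistic initial estimates
        self.probs = [1.0 / n_ops] * n_ops

    def select(self):
        # Sample an operator index according to the current probabilities.
        return random.choices(range(self.n), weights=self.probs)[0]

    def update(self, op, reward):
        # Exponential recency-weighted quality estimate for the used operator.
        self.quality[op] += self.alpha * (reward - self.quality[op])
        total = sum(self.quality)
        # Probability matching with a minimum selection probability p_min.
        self.probs = [
            self.p_min + (1 - self.n * self.p_min) * q / total
            for q in self.quality
        ]

random.seed(1)
aos = ProbabilityMatchingAOS(n_ops=3)
for _ in range(100):
    op = aos.select()
    # Toy reward signal: operator 2 is consistently the best performer.
    aos.update(op, reward=[0.1, 0.3, 0.9][op])
print(aos.probs)
```

In an MOEA/D setting the reward would typically be the fitness improvement the offspring achieves on its subproblem, credited to the operator that produced it.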