2022
DOI: 10.1007/978-3-031-14721-0_9

Do We Really Need to Use Constraint Violation in Constrained Evolutionary Multi-objective Optimization?

Cited by 4 publications (2 citation statements)
References 43 publications
“…Another missing, yet important, issue is how to handle constraints in the context of multiple objectives [78,79]. Things become even more challenging when the constraints are (partially) unobservable [80,81]. It is also worth noting that evolutionary computation and multi-objective optimization have been successfully applied to solve real-world problems, e.g., natural language processing [82], neural architecture search [83-86], robustness of neural networks [87-92], software engineering [93-97], smart grid management [2,98,99], communication networks [100-103], machine learning [104-108], and visualization [109].…”
Section: Discussion
confidence: 99%
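For context, the constraint violation the paper's title asks about is conventionally computed as the summed violation across all inequality and equality constraints. A minimal Python sketch of this standard measure follows; the function name, argument names, and the tolerance eps are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def constraint_violation(g_values, h_values, eps=1e-4):
    """Overall constraint violation CV(x), as conventionally defined in
    constrained evolutionary multi-objective optimization.

    g_values: values of inequality constraints g_i(x) <= 0
    h_values: values of equality constraints h_j(x) = 0, relaxed by eps
    """
    g = np.asarray(g_values, dtype=float)
    h = np.asarray(h_values, dtype=float)
    # Sum the positive parts of the inequality constraint values...
    cv = np.maximum(0.0, g).sum()
    # ...plus equality violations beyond the tolerance eps.
    cv += np.maximum(0.0, np.abs(h) - eps).sum()
    return cv
```

A solution with CV(x) = 0 is feasible; constraint-handling techniques typically rank solutions by this scalar, which is exactly the design choice the paper questions.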
“…This observation implies that the transfer learning approaches used in other peer algorithms might lead to a negative effect at the early stage of the hyperparameter optimization at each time step. In contrast, a simple restart from scratch is more robust at the early stage, though its performance might stagnate afterwards [125-128].…”
Section: Real-world Dynamic Optimization
confidence: 99%
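To make the contrast in this statement concrete, here is a toy Python sketch of the two initialization policies it compares (warm start via transfer versus restart from scratch). All names, the Gaussian perturbation, and the sigma scale are illustrative assumptions, not the cited algorithms:

```python
import random

def init_population(n, dim, lower, upper, prev_best=None, sigma=0.1):
    """Toy contrast of the two policies discussed above.

    Warm start (prev_best given): perturb the best configuration from the
    previous time step -- reuses knowledge, but risks negative transfer
    early on if the landscape has shifted.
    Restart (prev_best is None): sample uniformly from scratch -- robust
    early on, but may stagnate later since past knowledge is discarded.
    """
    if prev_best is not None:
        # Warm start: Gaussian perturbation around the previous optimum.
        return [
            [min(upper, max(lower, x + random.gauss(0.0, sigma * (upper - lower))))
             for x in prev_best]
            for _ in range(n)
        ]
    # Restart from scratch: uniform sampling over the whole search space.
    return [[random.uniform(lower, upper) for _ in range(dim)] for _ in range(n)]
```

The trade-off the statement describes falls out directly: the warm-start branch concentrates the initial population near a possibly outdated optimum, while the restart branch spends early evaluations re-exploring but carries no stale bias.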