2008
DOI: 10.1007/s10732-008-9077-z
Evolutionary multiobjective optimization in noisy problem environments

Abstract: This paper presents a multiobjective evolutionary algorithm (MOEA) capable of handling stochastic objective functions. We extend a previously developed approach to solve multiple objective optimization problems in deterministic environments by incorporating a stochastic nondomination-based solution ranking procedure. In this study, concepts of stochastic dominance and significant dominance are introduced in order to better discriminate among competing solutions. The MOEA is applied to a number of published tes…
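To make the ranking idea in the abstract concrete, here is a minimal Python sketch of a mean-plus-noise-margin dominance test between two solutions whose objectives are estimated from repeated noisy samples. The paper's exact definitions of stochastic and significant dominance are not visible in the truncated abstract, so the margin test, the z value, and the function names below are illustrative assumptions only (minimization assumed).

    import math

    def mean_and_se(samples):
        """Sample mean and standard error of one noisy objective."""
        n = len(samples)
        m = sum(samples) / n
        var = sum((s - m) ** 2 for s in samples) / (n - 1) if n > 1 else 0.0
        return m, math.sqrt(var / n)

    def significantly_dominates(a_samples, b_samples, z=1.96):
        """Illustrative 'significant dominance' test: A dominates B only if A is
        never worse than B by more than a z-scaled noise margin on any objective,
        and is better than B by more than that margin on at least one."""
        at_least_one_better = False
        for obj_a, obj_b in zip(a_samples, b_samples):
            ma, sa = mean_and_se(obj_a)
            mb, sb = mean_and_se(obj_b)
            margin = z * math.sqrt(sa ** 2 + sb ** 2)
            if ma > mb + margin:      # A significantly worse on this objective
                return False
            if ma < mb - margin:      # A significantly better on this objective
                at_least_one_better = True
        return at_least_one_better

    # a_samples / b_samples: one list of replicate values per objective, e.g.
    # A = [[1.1, 0.9, 1.0], [4.8, 5.2, 5.0]];  B = [[2.0, 2.1, 1.9], [5.1, 4.9, 5.0]]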

Cited by 39 publications (33 citation statements) | References 28 publications
“…However, this solution presents several drawbacks, such as: the necessity and difficulty of establishing a vector of ideal weights; the lack of other relevant solutions of the Pareto-optimal set; and the difficulty of homogenizing different quantities (e.g. quality, cost, and time) into a common metric unit in a single objective [20].…”
Section: Background To Multiobjective Optimization (mentioning)
confidence: 99%
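The drawback described in this excerpt concerns weighted-sum scalarization. As a rough illustration only (the cited work's formulation is not shown here), the sketch below collapses heterogeneous, all-minimized objectives into one scalar; the weights and rescaling factors are arbitrary assumptions, which is exactly the difficulty the excerpt points out.

    def weighted_sum(objectives, weights, scales):
        """Collapse heterogeneous objectives (e.g. cost-like quality penalty,
        cost, time; all minimized) into one scalar. Each objective must first
        be rescaled to a common unit, and the result depends entirely on the
        chosen weights."""
        return sum(w * (obj / s) for obj, w, s in zip(objectives, weights, scales))

    # Hypothetical example: quality penalty (0-1), cost in dollars, duration in days.
    score = weighted_sum([0.2, 1200.0, 14.0],
                         weights=[0.5, 0.3, 0.2],
                         scales=[1.0, 2000.0, 30.0])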
“…However, this solution presents several drawbacks, such as: the necessity and difficulty to establish a vector of ideal weights; the lack of other relevant solutions of the Pareto-optimal set; and the difficulty to homogenize different quantities (e.g. : quality, cost, and time) to a common metric unity in a single objective [20].…”
Section: Background To Multiobjective Optimizationmentioning
confidence: 99%
“…Eskandari and Geiger [31] considered the expected values of the objective functions and proposed the stochastic dominance relation for ranking the solutions. In the selection process, the solutions are divided into two sets depending on whether they are stochastically dominated.…”
Section: Definition 2 (Dominance Probability) (mentioning)
confidence: 99%
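A minimal sketch of the partition this excerpt describes, assuming dominance is checked on the sample means of the noisy objectives (minimization). This is not Eskandari and Geiger's exact stochastic dominance procedure, which also accounts for sampling variability; the function names are illustrative.

    def dominates(mean_a, mean_b):
        """Pareto dominance on expected (sample-mean) objective values."""
        return (all(a <= b for a, b in zip(mean_a, mean_b))
                and any(a < b for a, b in zip(mean_a, mean_b)))

    def split_by_stochastic_domination(population):
        """Partition solutions into nondominated and dominated sets, where each
        solution is a tuple of sample means of its noisy objectives."""
        nondominated, dominated = [], []
        for i, sol in enumerate(population):
            if any(dominates(other, sol)
                   for j, other in enumerate(population) if j != i):
                dominated.append(sol)
            else:
                nondominated.append(sol)
        return nondominated, dominated

    # e.g. [(1.0, 5.0), (2.0, 3.0), (2.5, 3.5)] -> third point is dominated by the second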
“…In tuning terms, this means that the utility of a parameter vector x can only be estimated. The usual way of improving these estimates is to repeat the measurements (Hughes, 2001; University and Fieldsend, 2005; Eskandari and Geiger, 2009; Deb and Gupta, 2005), that is, to do more EA runs using x, but this is clearly an expensive way of gaining more confidence. The main idea behind our technique is to do just one run with each parameter vector x and to improve the confidence by looking at the utilities of similar parameter vectors in our archive, assessed before.…”
Section: Multi-function Evolutionary Tuning Algorithm (mentioning)
confidence: 99%
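As an illustration of the "repeat the measurements" strategy that this excerpt contrasts with the archive-based technique: the run_ea callable and estimate_utility helper below are hypothetical names, not part of any cited implementation.

    import statistics

    def estimate_utility(run_ea, parameter_vector, repeats=10):
        """Estimate the expected utility of a parameter vector by repeated EA runs.
        The standard error shrinks roughly as 1/sqrt(repeats), so extra confidence
        is paid for with extra (expensive) runs."""
        utilities = [run_ea(parameter_vector) for _ in range(repeats)]
        mean = statistics.mean(utilities)
        se = (statistics.stdev(utilities) / repeats ** 0.5) if repeats > 1 else float("inf")
        return mean, se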