2006
DOI: 10.1080/03052150600848000

On the use of metamodel-assisted, multi-objective evolutionary algorithms

Cited by 96 publications (61 citation statements)
References 20 publications
“…Regarding the uncertain variables (flow conditions), it is assumed that these follow a normal distribution around an a-priori known average value and standard deviation. The general purpose optimization platform EASY (Evolutionary Algorithms SYstem, [2]) undertakes the optimization through Metamodel-Assisted Evolutionary Algorithms, MAEAs [14,15]. The overall workflow is outlined in Fig.…”
Section: The Proposed Workflow and Its Constituents (mentioning)
confidence: 99%
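The statement above describes drawing the uncertain flow conditions from a normal distribution with an a-priori known mean and standard deviation before the MAEA evaluates each candidate. The EASY platform itself is not exposed here, so the Python sketch below only illustrates that idea under assumed interfaces: `evaluate`, `design`, `mu` and `sigma` are hypothetical names, and returning mean-and-spread statistics is one common way to summarise an objective under uncertainty, not necessarily the formulation of the cited work.

```python
import numpy as np

def objective_under_uncertainty(design, evaluate, mu, sigma, n_samples=20, rng=None):
    """Illustrative sketch only: draw the uncertain flow conditions from a
    normal distribution with known mean (mu) and standard deviation (sigma),
    evaluate the candidate design for each draw, and return summary statistics.
    `evaluate`, `design` and the parameter names are hypothetical."""
    rng = np.random.default_rng() if rng is None else rng
    samples = rng.normal(loc=mu, scale=sigma, size=(n_samples, np.size(mu)))
    values = np.array([evaluate(design, s) for s in samples])
    # Mean and standard deviation of the objective over the uncertain inputs.
    return values.mean(), values.std()
```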
“…In this paper, without loss in generality, radial basis function (RBF) networks (Haykin 1999) are used as metamodels. For efficient training algorithms of metamodels during the evolution of a MAEA the reader should refer to (Karakasis and Giannakoglou 2005). Training the metamodels requires a pool (or database) of samples, shared among all demes.…”
Section: Distributed Hierarchical Evolutionary Algorithm (mentioning)
confidence: 99%
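Since the statement only names RBF networks as the metamodel and defers the actual training algorithm to the cited reference, the following is a generic Gaussian-RBF interpolant fitted on a pool of already evaluated samples. It is a minimal sketch under assumed interfaces, not the training scheme of Karakasis and Giannakoglou (2005).

```python
import numpy as np

class RBFMetamodel:
    """Minimal Gaussian RBF interpolant fitted on a pool (database) of
    evaluated samples. A generic sketch, not the cited training scheme."""

    def __init__(self, centers, values, width=1.0):
        self.centers = np.asarray(centers, dtype=float)
        self.width = width
        phi = self._kernel(self.centers)           # interpolation matrix
        # Exact interpolation weights through the sample values.
        self.weights = np.linalg.solve(phi, np.asarray(values, dtype=float))

    def _kernel(self, x):
        # Gaussian basis functions between the inputs and the stored centers.
        d = np.linalg.norm(x[:, None, :] - self.centers[None, :, :], axis=-1)
        return np.exp(-(d / self.width) ** 2)

    def predict(self, x):
        # Cheap surrogate prediction for one or more candidate solutions.
        return self._kernel(np.atleast_2d(x)) @ self.weights
```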
“…the metamodel. Metamodels (Giannakoglou 2002, Jin 2005, Karakasis and Giannakoglou 2005, Lim et al 2008, Zhou et al 2007, Emmerich et al 2006) are interpolation or approximation methods, such as polynomial regression, artificial neural networks, etc., which, after being trained on previously seen solutions, are used in place of E_1 to screen out non-promising candidate solutions at very low CPU cost. In this manner, only a few of the best offspring undergo evaluations on E_i, i > 0.…”
Section: Distributed Hierarchical Evolutionary Algorithm (mentioning)
confidence: 99%
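A minimal sketch of the screening idea described above: the trained metamodel ranks the offspring cheaply, and only the most promising few are passed on to the expensive evaluation tool. The helper names (`metamodel.predict`, `exact_evaluate`) are assumptions for illustration, and minimisation of a single objective is assumed.

```python
def screen_offspring(offspring, metamodel, exact_evaluate, n_exact=5):
    """Illustrative screening step (hypothetical helper names): the cheap
    metamodel pre-ranks all offspring and only the few most promising ones
    are re-evaluated with the expensive, problem-specific tool."""
    # Rank by the surrogate prediction; a single objective to be minimised is assumed.
    ranked = sorted(offspring, key=lambda x: metamodel.predict(x).item())
    promising = ranked[:n_exact]
    # Only these candidates consume calls to the costly evaluation model.
    return [(x, exact_evaluate(x)) for x in promising]
```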
“…In MAEAs, the role of metamodels is to provide approximations to the fitness or cost of candidate solutions and, thus, save a great number of evaluations which would otherwise be carried out by the CPU-demanding problem-specific tool. In Karakasis and Giannakoglou (2005), Giannakoglou et al (2001), they are used to pre-evaluate the population members, in the so-called Inexact Pre-Evaluation (IPE) phase of each generation of the MAEA. IPE is based on metamodels trained on the fly, on a small number of already evaluated individuals in the neighborhood of each individual (local metamodels).…”
Section: Introduction (mentioning)
confidence: 99%
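To make the local-metamodel idea concrete, here is a sketch of one IPE-style pass: for every population member, the nearest already evaluated samples are looked up in the database, a local surrogate is fitted on the fly on just those neighbours, and its prediction serves as the inexact pre-evaluation. It reuses the `RBFMetamodel` sketch above; the function name, the parameters and the single-objective assumption are illustrative, not the exact procedure of the cited papers.

```python
import numpy as np

def inexact_pre_evaluation(population, db_x, db_f, n_neighbors=10, width=1.0):
    """Sketch of an IPE-style pass with local metamodels: each individual gets
    a surrogate trained on its nearest already evaluated neighbours.
    Reuses the RBFMetamodel sketch above; a single scalar objective is assumed."""
    db_x = np.asarray(db_x, dtype=float)
    db_f = np.asarray(db_f, dtype=float)
    approximations = []
    for x in np.atleast_2d(np.asarray(population, dtype=float)):
        # Pick the closest previously evaluated samples to this individual.
        idx = np.argsort(np.linalg.norm(db_x - x, axis=1))[:n_neighbors]
        local = RBFMetamodel(db_x[idx], db_f[idx], width=width)  # local model, trained on the fly
        approximations.append(local.predict(x).item())
    return approximations
```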