2008
DOI: 10.1007/978-3-540-85984-0_29
Evolutionary Optimization with Dynamic Fidelity Computational Models

Abstract: In optimization, it is now a common practice to use lower fidelity computational models in place of the original model when dealing with problems with computationally expensive objective functions. In this paper, we present a study on evolutionary optimization with dynamic fidelity computational models capable of acclimatizing to localized complexity, for enhancing design search efficiency. In particular, we propose an evolutionary framework for model fidelity control that decides, at runtime, the ap…

Cited by 23 publications (18 citation statements)
References 20 publications (30 reference statements)
“…Lim et al. [32] use a lower-fidelity model to locally optimize each solution in the population. The fidelity used is decided based on a user-defined threshold η that specifies the minimum correlation required between the chosen model and the highest-fidelity model.…”
Section: A. Evolutionary Optimization Using Surrogate Models
confidence: 99%
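The threshold rule quoted above can be illustrated with a short Python sketch. This is not the authors' implementation: the names `models`, `high_fidelity`, `probe_points`, and `eta`, the use of Pearson correlation, and the small probe sample are all assumptions made for illustration.

```python
import numpy as np

def select_fidelity(models, high_fidelity, probe_points, eta=0.9):
    """Return the cheapest model whose outputs correlate with the
    highest-fidelity model by at least eta on a small probe sample.

    `models` is assumed ordered from cheapest to most expensive; the
    names and the Pearson correlation are illustrative choices only.
    """
    y_high = np.array([high_fidelity(x) for x in probe_points])
    for model in models:                       # cheapest candidates first
        y_low = np.array([model(x) for x in probe_points])
        corr = np.corrcoef(y_low, y_high)[0, 1]
        if corr >= eta:                        # meets the user-defined threshold
            return model
    return high_fidelity                       # no cheaper model qualifies
```

Each candidate solution would then be locally optimized with the model returned here, so that calls to the highest-fidelity model stay confined to the probe sample.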
“…The best-performing individuals from the bottom layer would then migrate into a top layer for more exact evaluation, and vice versa. Also in this class, Lim et al. [23] explored an evolutionary optimization algorithm that adjusted the simulation fidelity during the search based on the correlation between adjacent members in the population of candidate solutions.…”
Section: Expensive Optimization Problems
confidence: 99%
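A minimal sketch of the bottom-layer/top-layer migration described in this statement, assuming a minimization problem; the names `cheap_eval`, `exact_eval`, and `top_size` are illustrative, and the cited works' actual migration and back-migration rules are not reproduced here.

```python
import numpy as np

def promote_to_top_layer(population, cheap_eval, exact_eval, top_size=5):
    """Screen the bottom layer with a low-fidelity model and migrate the
    best candidates to the top layer for exact (expensive) evaluation."""
    cheap_scores = np.array([cheap_eval(x) for x in population])
    order = np.argsort(cheap_scores)                    # minimization assumed
    top_layer = [population[i] for i in order[:top_size]]
    exact_scores = [exact_eval(x) for x in top_layer]   # expensive calls only here
    return top_layer, exact_scores
```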
“…$\Phi_{i,j} = \phi_j(\mathbf{x}_i)$ (23), where $\mathbf{x}_i$, $i = 1, \dots, n$, are the sample vectors and $\mathbf{f}$ is the vector of corresponding objective function values. As mentioned in Section 3, the proposed algorithm used a radial basis function network with $\tilde{n} = 0.8\,n$, i.e., the number of neurons was 80 % of the sample size.…”
confidence: 99%
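Equation (23) in the quote defines the radial basis design matrix, Φ[i, j] = φ_j(x_i). A hedged NumPy sketch with Gaussian basis functions follows; the exact basis function, width rule, and centre selection of the cited work are not given in the statement, so `width` and the centre choice are assumptions.

```python
import numpy as np

def rbf_design_matrix(samples, centers, width=1.0):
    """Phi[i, j] = phi_j(x_i) for Gaussian basis functions (assumed form)."""
    X = np.asarray(samples, dtype=float)   # shape (n, d): sample vectors x_i
    C = np.asarray(centers, dtype=float)   # shape (n_tilde, d): basis centres
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Per the quote, the number of neurons is 80 % of the sample size; one
# plausible (assumed) centre choice is a random subset of the samples:
#   n_tilde = int(0.8 * len(samples))
#   centers = samples[np.random.choice(len(samples), n_tilde, replace=False)]
# The network weights w then solve Phi @ w ≈ f, e.g.
#   w, *_ = np.linalg.lstsq(rbf_design_matrix(samples, centers), f, rcond=None)
```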
“…To enhance the accuracy of the surrogates used, researchers have turned to localized models [Emmerich et al., 2002, Ong et al., 2003, Regis and Shoemaker, 2004] as opposed to globalized models or their synergies [Zhou et al., 2005]. Others have also considered the use of gradient information [Ong et al., 2004] to enhance the prediction accuracy of the constructed surrogate models, or physics-based models that are deemed to be more trustworthy than their purely data-centric counterparts [Keane and Petruzzelli, 2000, Lim et al., 2008].…”
Section: Introduction
confidence: 99%