2014
DOI: 10.1007/s10898-014-0210-2
A multiobjective optimization based framework to balance the global exploration and local exploitation in expensive optimization



Cited by 62 publications (26 citation statements)
References 19 publications
“…The aggregation method combines the objective functions into a scalar objective function that is used in a SOO context, thus producing one single compromise solution. To obtain an approximation to the Pareto front, the SOO method must be run as many times as the desired number of points, using different weight vectors [ 16 ].…”
Section: Multi-objective Optimization Approach
confidence: 99%
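The weighted-sum aggregation described in this statement can be sketched as follows. The two quadratic objectives `f1`/`f2`, the starting point, and the 11-point weight sweep are illustrative assumptions, not taken from the cited work:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical bi-objective problem (illustrative, not from the paper).
def f1(x):
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):
    return x[0] ** 2 + (x[1] - 1.0) ** 2

def aggregate(x, w):
    # Weighted-sum scalarization: one SOO problem per weight vector.
    return w[0] * f1(x) + w[1] * f2(x)

# One SOO run per weight vector yields one compromise solution each;
# sweeping the weights approximates the Pareto front point by point.
pareto_approx = []
for w1 in np.linspace(0.0, 1.0, 11):
    w = np.array([w1, 1.0 - w1])
    res = minimize(aggregate, x0=np.zeros(2), args=(w,))
    pareto_approx.append((f1(res.x), f2(res.x)))
```

Each run produces a single compromise solution, which is exactly why the statement notes the SOO method must be rerun once per desired Pareto point.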
“…Multi-objective evolutionary algorithms based on decomposition (recognized in the scientific literature as MOEA/D) decompose a MOO problem into a number of scalar optimization subproblems (using a weight-dependent scalar aggregation function) and optimize them simultaneously [ 14 – 16 ]. The subproblems are simultaneously solved by handling a population of solutions that comprise the best solutions found so far for each subproblem.…”
Section: Introduction
confidence: 99%
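The decomposition idea in this statement can be illustrated with the Tchebycheff aggregation commonly used in MOEA/D: each weight vector defines one scalar subproblem, and a candidate solution is scored against all subproblems at once. The weight vectors, ideal point, and candidate values below are illustrative assumptions:

```python
import numpy as np

def tchebycheff(fvals, weights, z_star):
    # g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|
    return np.max(weights * np.abs(fvals - z_star), axis=-1)

# Uniformly spread weight vectors, one scalar subproblem each.
n_sub = 5
w1 = np.linspace(0.0, 1.0, n_sub)
w = np.stack([w1, 1.0 - w1], axis=1)

z_star = np.array([0.0, 0.0])   # ideal point (best f_i seen so far)
fvals = np.array([0.4, 0.6])    # objective values of one candidate solution

# The same candidate is scored against every subproblem; MOEA/D keeps,
# for each subproblem, the best-scoring solution found so far.
scores = tchebycheff(fvals, w, z_star)
```

Because the subproblems share a population and neighboring weight vectors define similar subproblems, solving them simultaneously lets good solutions propagate between neighbors.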
“…Hence, in this paper, the above process is carried out iteratively. In each iteration, to balance the global exploration and local exploitation (Zhou et al. 2007; Feng et al. 2015), two new points, namely the optimum of Eq. (29) and the point at which the samples are most sparse in the design space, are evaluated with the FEM and then added to the training sample set to gradually improve the three surrogate models until the optimization converges.…”
confidence: 99%
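The "most sparse" exploration point in this statement can be sketched as a max-min-distance selection: among candidate locations, pick the one whose distance to its nearest already-evaluated sample is largest. The 2-D design space, sample counts, and random candidates are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
evaluated = rng.random((20, 2))      # points already evaluated (e.g., by the FEM)
candidates = rng.random((1000, 2))   # candidate infill locations in the design space

# Pairwise distances: shape (n_candidates, n_evaluated).
d = np.linalg.norm(candidates[:, None, :] - evaluated[None, :, :], axis=-1)

# Max-min criterion: the candidate farthest from its nearest sample
# sits in the sparsest region of the design space.
sparse_point = candidates[np.argmax(d.min(axis=1))]
```

Evaluating this point alongside the surrogate optimum is what gives the iteration its exploration/exploitation balance: one point refines the current best region, the other fills the least-sampled one.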
“…The bi-objective optimization problem (20) minimizes the predicted objective function value and maximizes the distance to the previously evaluated solutions. For this problem, we can employ well-developed multi-objective evolutionary algorithms (MOEAs), such as the non-dominated sorting genetic algorithm II (NSGA-II) [44] and the multiobjective evolutionary algorithm based on decomposition (MOEA/D) [45]–[47], to get the non-dominated front (NF) and corresponding non-dominated set (NS). In this paper, we choose NSGA-II since the magnitudes of these two objectives are different.…”
Section: Batch Infill Sampling Criterion
confidence: 99%
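The two infill objectives in this statement can be sketched directly: minimize the surrogate prediction and maximize the distance to the evaluated set (negated below so both are minimized), then keep the non-dominated candidates. The surrogate, the sample sets, and the brute-force dominance filter (standing in for NSGA-II's non-dominated sorting) are illustrative assumptions:

```python
import numpy as np

def surrogate(x):
    # Illustrative stand-in for the trained surrogate predictor.
    return np.sum((x - 0.5) ** 2, axis=-1)

rng = np.random.default_rng(1)
evaluated = rng.random((15, 2))     # previously evaluated solutions
candidates = rng.random((200, 2))   # candidate infill solutions

f1 = surrogate(candidates)                                   # predicted value (minimize)
d = np.linalg.norm(candidates[:, None] - evaluated[None], axis=-1)
f2 = -d.min(axis=1)                                          # -(distance to nearest sample)

# Brute-force non-dominated filter: candidate i is kept unless some j
# is no worse in both objectives and strictly better in at least one.
F = np.stack([f1, f2], axis=1)
nondom = np.array([
    not np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
    for i in range(len(F))
])
non_dominated_set = candidates[nondom]
```

The statement's point about differing objective magnitudes matters here: Pareto dominance (as in NSGA-II) is scale-invariant, whereas a decomposition-based aggregation would need the two objectives normalized first.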