[Proceedings] 1992 IEEE International Conference on Systems, Man, and Cybernetics
DOI: 10.1109/icsmc.1992.271617
A statistical method for global optimization

Abstract: An algorithm for finding global optima using statistical prediction is presented. Assuming a random function model, lower confidence bounds on predicted values are used for sequential selection of evaluation points and as a convergence criterion. Performance comparisons with published results on several test functions indicate that the procedure is very efficient in finding the global optimum of a multimodal function, and in terminating with relatively few evaluations.
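The procedure the abstract describes — fit a random-function (kriging) model to the points evaluated so far, then pick the next evaluation point by minimizing a lower confidence bound on the prediction — can be sketched in a few dozen lines. This is a minimal illustrative reconstruction, not the paper's exact algorithm: the Gaussian kernel, length scale, candidate grid, constant c = 2, and the Forrester test function are all assumptions made for the demo.

```python
import math

def forrester(x):
    """Multimodal 1-D test function; global minimum f ~= -6.02 near x ~= 0.757."""
    return (6 * x - 2) ** 2 * math.sin(12 * x - 4)

def rbf(a, b, length=0.15):
    """Gaussian (RBF) correlation between two sample sites (length scale assumed)."""
    return math.exp(-((a - b) ** 2) / (2 * length * length))

def solve(A, b):
    """Gaussian elimination with partial pivoting; fine for the small systems here."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    z = [0.0] * n
    for r in range(n - 1, -1, -1):
        z[r] = (M[r][n] - sum(M[r][k] * z[k] for k in range(r + 1, n))) / M[r][r]
    return z

def gp_fit(xs, ys):
    """Fit a zero-mean kriging model; return a (mean, standard error) predictor."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (1e-6 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    # Explicit inverse (column by column) so prediction is cheap per candidate.
    Kinv_cols = [solve(K, [1.0 if i == j else 0.0 for i in range(n)]) for j in range(n)]
    alpha = solve(K, ys)
    def predict(x):
        k = [rbf(xi, x) for xi in xs]
        mean = sum(k[i] * alpha[i] for i in range(n))
        Kinv_k = [sum(Kinv_cols[j][i] * k[j] for j in range(n)) for i in range(n)]
        var = max(rbf(x, x) - sum(k[i] * Kinv_k[i] for i in range(n)), 0.0)
        return mean, math.sqrt(var)
    return predict

def optimize(f, lo=0.0, hi=1.0, n_init=4, n_iter=15, c=2.0):
    """Sequential selection: next point minimizes the lower bound mean - c * s."""
    xs = [lo + (hi - lo) * i / (n_init - 1) for i in range(n_init)]
    ys = [f(x) for x in xs]
    grid = [lo + (hi - lo) * i / 100 for i in range(101)]
    for _ in range(n_iter):
        predict = gp_fit(xs, ys)
        x_next = min(grid, key=lambda x: predict(x)[0] - c * predict(x)[1])
        xs.append(x_next)
        ys.append(f(x_next))
    best = min(range(len(ys)), key=ys.__getitem__)
    return xs[best], ys[best]

best_x, best_y = optimize(forrester)
```

The lower-confidence-bound rule is what makes the search both efficient and terminating: near already-sampled points the standard error collapses, so the bound there equals the observed value and the search moves on, while unexplored regions stay attractive until either their predicted mean or their uncertainty rules them out.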

Cited by 170 publications (133 citation statements)
References 10 publications
“…A variety of surrogate models, such as PRSM, kriging and its variants (GEK [47], cokriging [33], HK [34]), RBFs, ANN, SVR [22], etc., were implemented. A couple of infill-sampling criteria and dedicated constraint handling methods were implemented, such as minimizing surrogate predictor (MSP) [76], expected improvement [77,78], probability of improvement [5], mean-squared error (MSE) [79,80], lower-confidence bounding [51,81], target searching [74], and parallel infilling [30]. Some well-accepted and highly matured optimization algorithms, such as Hooke and Jeeves pattern search, Simplex, BFGS quasi-Newton's method, sequential quadratic programming (SQP), and single/multi objective genetic algorithms (GAs) [82], are employed to solve the suboptimization(s), in which the cost function(s) and constraint function(s) are evaluated by the cheap surrogate models.…”
Section: A Integration Into a Surrogate-based Optimization Codementioning
confidence: 99%
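Several of the infill-sampling criteria named in the excerpt above reduce to simple closed forms once the surrogate supplies a predicted mean and standard error at a candidate point. A sketch under a minimization convention (the function names and the c = 2 default are mine, not from any of the cited codes):

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def expected_improvement(mean, s, y_best):
    """EI: expected amount by which the candidate improves on y_best (maximize)."""
    if s <= 0.0:
        return max(y_best - mean, 0.0)
    z = (y_best - mean) / s
    return (y_best - mean) * normal_cdf(z) + s * normal_pdf(z)

def probability_of_improvement(mean, s, y_best):
    """PI: probability the candidate beats the current best value (maximize)."""
    if s <= 0.0:
        return 1.0 if mean < y_best else 0.0
    return normal_cdf((y_best - mean) / s)

def lower_confidence_bound(mean, s, c=2.0):
    """LCB: optimistic bound, as in Cox & John's SDO; smaller is more promising."""
    return mean - c * s

def mean_squared_error(s):
    """MSE criterion: pure exploration, sample where the surrogate is most uncertain."""
    return s * s
```

Minimizing the surrogate predictor (MSP) is the degenerate case c = 0 of the LCB; the other criteria differ mainly in how aggressively they reward predictive uncertainty.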
“…We distinguish between two scenarios in which the polling algorithm as described thus far must be adjusted to avoid violating the (n − 1)-dimensional boundaries of the feasible domain. In the first scenario, the CMP is relatively far (that is, greater than ρₙ but less than 2ρₙ) from the boundary of the feasible domain, and thus one or more of the poll points as determined by one of the algorithms proposed in §II-B might land slightly outside this boundary.…”
Section: Implementation Of Feasible Domain Boundariesmentioning
confidence: 99%
“…To achieve this, consider the minimization of J(x) = f̂(x) − c · s²(x), where c is some constant (see Cox & John 1997 and Jones 2001). A search coordinated by this function will tend to explore regions of parameter space where both the predictor of the function value is relatively low and the uncertainty of this prediction in the Kriging model is relatively high.…”
Section: A Review Of Global Optimization Strategies Leveraging Krmentioning
confidence: 99%
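The trade-off in the J(x) criterion quoted above is easy to see numerically: with c = 0 the search is purely exploitative (it just minimizes the predictor), while a larger c favors uncertain regions. A toy illustration with made-up mean/variance pairs for two hypothetical candidate points:

```python
def J(mean, s2, c):
    """Cox & John-style criterion: predicted value minus c times predictive variance."""
    return mean - c * s2

# Illustrative numbers only: A has a low prediction with low uncertainty,
# B a higher prediction with high uncertainty.
a = (1.0, 0.1)   # (predicted mean, prediction variance s^2)
b = (2.0, 1.5)

for c in (0.0, 2.0):
    pick = "A" if J(*a, c) < J(*b, c) else "B"
    print(f"c = {c}: pick {pick}")
```

With c = 0 the search picks A (lower mean); with c = 2 the uncertainty bonus flips the choice to B, which is exactly the exploratory behavior the excerpt describes.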
“…In addition, there is the interval estimation (IE) policy (Kaelbling 1993), which combines exploration with exploitation by measuring the compound for which a particular linear combination of the posterior mean and posterior standard deviation is largest. IE was applied to BGO in Cox and John (1997), where it was called SDO, or sequential design for optimization.…”
Section: Other Related Measurement Policiesmentioning
confidence: 99%