This article addresses kriging-based optimization of stochastic simulators. Many such simulators depend on factors that tune the precision of the response, the gain in accuracy coming at the price of computational time. The contribution of this work is twofold: first, we propose a quantile-based criterion for the sequential design of experiments, in the spirit of the classical Expected Improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for allocating the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. Supplementary material for this article is available online.
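The classical Expected Improvement criterion that the proposed quantile-based criterion generalizes has a well-known closed form. A minimal sketch of it, for minimization given the kriging posterior mean and standard deviation at a candidate point (the function names are illustrative, not from the article):

```python
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    # standard normal density
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(mu, sigma, f_min):
    """Classical EI for minimization: mu and sigma are the kriging
    posterior mean and standard deviation at the candidate point,
    f_min the best observed value so far."""
    if sigma <= 0.0:
        # no posterior uncertainty: improvement is deterministic
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm_cdf(z) + sigma * norm_pdf(z)
```

The sequential design then evaluates the simulator at the point maximizing this criterion; the article's contribution replaces it with a quantile-based variant suited to heterogeneous response precisions.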
Stepwise Uncertainty Reduction (SUR) strategies aim at constructing a sequence of points for evaluating a function f in such a way that the residual uncertainty about a quantity of interest progressively decreases to zero. Using such strategies in the framework of Gaussian process modeling has been shown to be efficient for estimating the volume of excursion of f above a fixed threshold. However, SUR strategies remain cumbersome to use in practice because of their high computational complexity, and because they deliver a single point at each iteration. In this paper we introduce several multi-point sampling criteria, allowing the selection of batches of points at which f can be evaluated in parallel. Such criteria are of particular interest when f is costly to evaluate and several CPUs are simultaneously available. We also drastically reduce the computational cost of these strategies through the use of closed-form formulas. We illustrate their performance in various numerical experiments, including a nuclear safety test case. Basic notions about kriging, auxiliary problems, complexity calculations, R code, and data are available online as supplementary materials.
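The excursion volume targeted by these SUR strategies can be estimated from the Gaussian process posterior through the pointwise coverage probability P(f(x) > T). A minimal sketch under assumed posterior mean and standard deviation functions (all names here are illustrative, not the paper's implementation):

```python
import numpy as np
from math import erf, sqrt

def coverage(m, s, T):
    # posterior probability that f(x) exceeds T, given the GP
    # posterior mean m and standard deviation s at x
    return 0.5 * (1.0 - erf((T - m) / (s * sqrt(2.0))))

def excursion_volume(mean_fn, sd_fn, T, n=10000, seed=0):
    """Monte Carlo estimate of the expected excursion volume
    E[vol{x in [0,1] : f(x) > T}] by averaging the pointwise
    coverage probabilities over a uniform design."""
    rng = np.random.default_rng(seed)
    X = rng.random(n)  # uniform design on [0, 1]
    p = np.array([coverage(mean_fn(x), sd_fn(x), T) for x in X])
    return p.mean()
```

A SUR criterion then scores candidate batches by how much a new evaluation is expected to reduce the uncertainty on this quantity; the closed-form formulas mentioned above make that scoring cheap.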
We consider the problem of estimating the set of all inputs that lead a system to some particular behavior. The system is modeled with an expensive-to-evaluate function, such as a computer experiment, and we are interested in its excursion set, i.e. the set of points where the function takes values above or below some prescribed threshold. The objective function is emulated with Gaussian Process (GP) models based on an initial design of experiments enriched with evaluation results at (batch-)sequentially determined input points. The GP model provides conservative estimates for the excursion set, which control false positives while minimizing false negatives. We introduce adaptive strategies that sequentially select new evaluations of the function by reducing the uncertainty on conservative estimates. Following the Stepwise Uncertainty Reduction approach, we obtain new evaluations by minimizing adapted criteria. Tractable formulae for the conservative criteria are derived, allowing more convenient optimization. The method is benchmarked on random functions generated under the model assumptions in two and five dimensions, and applied to a reliability engineering test case. Overall, the proposed strategy of minimizing false negatives in conservative estimation achieves competitive performance in terms of both model-based and model-free indicators.
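The simplest way to see how a conservative estimate controls false positives is a pointwise rule: classify a point as inside the excursion set only when its posterior probability of exceeding the threshold is high. This is a simplified illustration under assumed posterior summaries; the paper's conservative estimates use a more refined construction than this pointwise rule.

```python
import numpy as np
from math import erf, sqrt

def conservative_set(means, sds, T, alpha=0.95):
    """Pointwise conservative estimate of the excursion set {f >= T}:
    keep only points whose posterior exceedance probability is at
    least alpha.  This controls false positives at the price of
    possible false negatives, which the sequential design then
    works to reduce."""
    means = np.asarray(means, dtype=float)
    sds = np.asarray(sds, dtype=float)
    z = (T - means) / (sds * sqrt(2.0))
    p = 0.5 * (1.0 - np.array([erf(v) for v in z]))  # P(f(x) >= T)
    return p >= alpha
```

Points with high posterior mean and low uncertainty are retained; uncertain points are excluded until further evaluations shrink their posterior standard deviation.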
Numerical modeling tools, such as the 1D hydraulic model Crue9 developed by CNR, are now widely used to analyze the hydraulic and hydrological behavior of rivers. These tools rely on a set of input parameters, physical or numerical, that carry uncertainties. The Prométhée tool, developed by IRSN, performs uncertainty propagation and two kinds of sensitivity analysis: a deterministic method (Morris), based on screening techniques, identifies the parameters that drive the variability of the outputs of interest; and a probabilistic method (FAST), based on the analysis of output variance with respect to input variance, ranks the input parameters according to the sensitivity of the outputs. Uncertainty propagation studies require a large number of simulations. To this end, the Prométhée/Crue9 coupling is used: it instruments the files associated with the hydraulic simulations, launches many computations, and post-processes the results with statistical tools. This coupled tool thus makes it possible to carry out sensitivity studies, in particular probabilistic ones, by parameterizing complex hydraulic models involving many parameters.
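The Morris screening method mentioned above ranks inputs by their elementary effects: one-at-a-time perturbations of each input around random base points. A generic sketch of the idea (this is not the Prométhée implementation, and the function names are illustrative):

```python
import numpy as np

def morris_elementary_effects(f, d, r=20, delta=0.1, seed=0):
    """One-at-a-time (Morris) screening: for r random base points in
    [0, 1-delta]^d, perturb each of the d inputs by delta in turn and
    record the elementary effect (f(x + delta*e_i) - f(x)) / delta.
    Returns mu_star, the mean absolute effect per input, used to
    rank the inputs."""
    rng = np.random.default_rng(seed)
    ee = np.empty((r, d))
    for k in range(r):
        x = rng.random(d) * (1.0 - delta)  # base point
        y0 = f(x)
        for i in range(d):
            xp = x.copy()
            xp[i] += delta  # perturb input i only
            ee[k, i] = (f(xp) - y0) / delta
    return np.abs(ee).mean(axis=0)
```

Inputs with large mu_star drive the output variability and are kept for the finer, more expensive variance-based analysis (FAST); negligible ones can be frozen.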
In many situations, physical systems are known to satisfy inequality constraints with respect to some or all input parameters. When building a surrogate model of such a system (as in the framework of computer experiments), this expert knowledge should be integrated into the emulator structure. We propose a new methodology to incorporate both equality conditions and inequality constraints into a Gaussian process emulator, such that all conditional simulations satisfy the inequality constraints over the whole domain. An estimator called the mode (maximum a posteriori) is computed and satisfies the inequality constraints. Here we focus on the estimation of covariance hyper-parameters and on cross-validation methods. We prove that these methods are suited to inequality constraints. Applied to real data in two dimensions, the numerical results show that the leave-one-out mean square error criterion using the mode is more efficient than the usual (unconstrained) kriging mean.