Summary
This work proposes a statistical effect-screening method to identify design parameters of a numerical simulation that are influential on performance while remaining robust to the epistemic uncertainty introduced by calibration variables. Design parameters are controlled by the analyst, although the optimal design is often uncertain; calibration variables, by contrast, are introduced by modeling choices. We argue that the uncertainties introduced by design parameters and calibration variables should be treated differently, despite potential interactions between the two sets. Herein, a robustness criterion is embedded in the effect screening to guarantee the influence of design parameters irrespective of the values assigned to calibration variables. The Morris screening method is used to explore the design space, while robustness to uncertainty is quantified within the framework of info-gap decision theory. The proposed method is applied to the National Aeronautics and Space Administration Multidisciplinary Uncertainty Quantification Challenge Problem, a black-box code for aeronautic flight guidance with 35 input parameters. The application demonstrates that a large number of variables can be handled without formulating simplifying assumptions about the potential coupling between calibration variables and design parameters. Given the computational efficiency of the Morris screening method, we conclude that the analysis can be extended to even larger-dimensional problems. (Approved for unlimited, public release on October 9, 2013, LA-UR-13-27839, Unclassified.) Copyright © 2015 John Wiley & Sons, Ltd.
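To make the screening step concrete, the following is a minimal sketch of the Morris one-at-a-time elementary-effects method referenced above, not the authors' implementation. It assumes inputs scaled to the unit hypercube and a scalar response `f`; the function name, trajectory count `r`, and step size `delta` are illustrative choices, and the per-input statistics (mean absolute effect and standard deviation) are the usual Morris screening measures of influence and nonlinearity/interaction.

```python
import random
import statistics

def morris_elementary_effects(f, k, r=10, delta=0.5, seed=0):
    """Screen k inputs of a scalar model f on [0, 1]^k.

    Builds r one-at-a-time trajectories: each input is perturbed once
    by delta, in random order, and the resulting elementary effect
    (finite difference) is recorded. Returns per-input lists
    (mu_star, sigma): mean |elementary effect| and its std deviation.
    """
    rng = random.Random(seed)
    effects = [[0.0] * k for _ in range(r)]
    for t in range(r):
        # Random base point, leaving room to add delta to any input.
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        fx = f(x)
        order = list(range(k))
        rng.shuffle(order)  # randomize the one-at-a-time order
        for i in order:
            x[i] += delta
            fx_new = f(x)
            effects[t][i] = (fx_new - fx) / delta  # elementary effect
            fx = fx_new
    mu_star = [statistics.mean(abs(effects[t][i]) for t in range(r))
               for i in range(k)]
    sigma = [statistics.stdev(effects[t][i] for t in range(r))
             for i in range(k)]
    return mu_star, sigma
```

For a linear model such as `f(x) = 3*x[0] + 0.1*x[1]`, `mu_star` recovers the coefficients (3.0 and 0.1) and `sigma` is near zero, correctly ranking the first input as influential; the total cost is r*(k+1) model runs, which is the efficiency property the summary relies on for scaling to 35 or more inputs.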