2014
DOI: 10.1002/2013wr014767
Quantifying the predictive consequences of model error with linear subspace analysis

Abstract: All computer models are simplified and imperfect simulators of complex natural systems. The discrepancy arising from simplification induces bias in model predictions, which may be amplified by the process of model calibration. This paper presents a new method to identify and quantify the predictive consequences of calibrating a simplified computer model. The method is based on linear theory, and it scales efficiently to the large numbers of parameters and observations characteristic of groundwater and petroleum…

Cited by 73 publications (82 citation statements)
References 37 publications
“…Oversimplification of a model can be just as detrimental to its forecasts as the widely recognized issue of being too complex (Hunt et al.; Doherty and Christensen). Worse, inappropriate parameter simplification can lead to non-trivial and undetectable forecast biases (White et al.) because overly simple models limit the ability of information residing within the observations to inform and constrain the parameters in appropriate and meaningful ways. In our model, the use of one K zone for 16 head observations results in a parameter estimation problem where the observations can “speak” (inform parameters) but the model does not have the correct means to “hear” (Doherty and Hunt).…”
Section: Results
confidence: 99%
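The forecast bias from oversimplification described above can be reproduced in a minimal linear toy problem. The sketch below is an illustrative assumption, not the cited authors' model: 16 "head observations" respond linearly to two hydraulic-conductivity zones, but calibration lumps both zones into a single K. The lumped model can match the heads reasonably well while a forecast that weights the zones differently than the observations do comes out biased.

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" system: 16 head observations respond linearly to two K-zone
# parameters (all values here are illustrative, not from the cited study).
k_true = np.array([2.0, 5.0])             # K in zone 1 and zone 2
A = rng.uniform(0.5, 1.5, size=(16, 2))   # observation sensitivities
heads_obs = A @ k_true                    # noise-free observations

# Oversimplified model: one lumped K zone (the two columns collapsed).
a_lumped = A.sum(axis=1)
k_lumped, *_ = np.linalg.lstsq(a_lumped[:, None], heads_obs, rcond=None)

# A forecast that depends mostly on zone 2 -- unlike the observations,
# which average over both zones.
w = np.array([0.1, 0.9])
forecast_true = w @ k_true                # 4.7
forecast_simple = w.sum() * k_lumped[0]   # roughly the lumped K, ~3.5

print(f"lumped K: {k_lumped[0]:.3f}")
print(f"true forecast: {forecast_true:.3f}  "
      f"simplified forecast: {forecast_simple:.3f}")
```

The lumped K lands between the two true zone values because that is what best fits the heads, so the calibrated simple model "hears" only an average of what the observations "say" — and the zone-2-dominated forecast inherits the resulting bias.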
“…In real-world applied modeling, it is not as easy to determine how parameter surrogacy affects model forecasts, and different forecast types will be affected to different degrees. However, strategies are available to minimize the detrimental effects of model structural error (e.g., Doherty and Welter; White et al.). In any event, the pilot-point parameterization yielded the best result of the approaches tried (as evidenced by the lower forecast RMSE) and was clearly superior to the 1988 manual calibration.…”
Section: Results
confidence: 99%
“…The calibration-constrained subspace uncertainty analysis method of Tonkin and Doherty, often referred to as Null Space Monte Carlo (NSMC), has several limitations that could be addressed by more computationally intensive analysis methods. NSMC relies on an assumption of model linearity about a calibrated model produced by local optimization techniques (PEST); so while it is computationally efficient, it may not be robust in the presence of model nonlinearity, which can produce multiple local minima (Mosegaard and Tarantola). Due to limitations on what aspects of a model can be represented as an adjustable parameter, and the increasing computational cost of additional parameterization, aspects of model structure, such as model geometry representing uncertain geological features, are fixed during the analysis, potentially leading to bias and underprediction of predictive uncertainty (White et al.; Hermans et al.). To optimize model parameter values to reduce observation-data misfit and represent models as linear approximations, the PEST/NSMC approach requires parameters that are smoothly varying and differentiable. However, geology is often categorical in nature, with sharp contrasts in physical properties across facies changes that are impossible to represent with smooth, differentiable parameters (Refsgaard et al.).…”
Section: Future Directions of Groundwater Modeling
confidence: 99%
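The null-space projection at the heart of NSMC, and the linearity assumption the statement above criticizes, can be sketched for a purely linear stand-in model (the matrix, dimensions, and names below are illustrative assumptions, not PEST's actual implementation, which works with the Jacobian of a calibrated nonlinear model): random parameter perturbations are projected onto the null space of the sensitivity matrix, so every realization reproduces the calibrated fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear stand-in model: obs = J @ p, with more parameters than
# observations, so J has a nontrivial null space -- the under-determined
# calibration setting that NSMC exploits.
n_obs, n_par = 8, 20
J = rng.normal(size=(n_obs, n_par))      # illustrative sensitivity matrix
p_cal = rng.normal(size=n_par)           # "calibrated" parameter set
obs = J @ p_cal                          # the model exactly fits these

# Orthonormal basis for the null space of J via SVD: the last
# (n_par - n_obs) right singular vectors.
_, _, Vt = np.linalg.svd(J)
V_null = Vt[n_obs:].T                    # shape (n_par, n_par - n_obs)

# NSMC step: draw a perturbation, project it onto the null space, and add
# it to the calibrated parameters. Because the model here is exactly
# linear, the fit is preserved to machine precision; for a nonlinear
# model it holds only to first order, which is the robustness concern.
realizations = [p_cal + V_null @ (V_null.T @ rng.normal(size=n_par))
                for _ in range(100)]

misfits = [np.linalg.norm(J @ p - obs) for p in realizations]
print(f"max misfit over 100 realizations: {max(misfits):.2e}")
```

Note also why smooth, differentiable parameters are required: the projection is built from derivatives of model outputs with respect to parameters, which do not exist for categorical properties such as facies membership.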
“…Previous research has shown that the subjective process of selecting which model inputs to treat as uncertain (e.g., parameterization) may affect uncertainty estimates in model outcomes (White et al, 2014). Herein, parameterization refers to the subjective and necessary process of selecting uncertain model inputs to treat as adjustable in the conditioning process.…”
Section: Introduction
confidence: 99%