Summary. We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.
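In outline, the calibration model just described can be written as follows. This is a minimal sketch for orientation only, and the notation (code output \(\eta\), model discrepancy \(\delta\), true calibration parameters \(\theta\)) is assumed here for illustration rather than quoted from the summary. An observation \(z_i\) of the system at input \(x_i\) is taken to be the true process value \(\zeta(x_i)\) plus observation error, with the true process represented as the code output at the best-fitting parameter values plus a discrepancy term:
\[
z_i \;=\; \zeta(x_i) + e_i \;=\; \eta(x_i, \theta) + \delta(x_i) + e_i,
\qquad e_i \sim \mathrm{N}(0, \lambda),
\]
so that conditioning on the observed data yields posterior uncertainty about both \(\theta\) and \(\delta(\cdot)\), and both sources of uncertainty propagate into predictions of future system behaviour.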