Abstract. We present a method for computing reduced-order models of parameterized partial differential equation solutions. The key analytical tool is the singular value expansion of the parameterized solution, which we approximate with a singular value decomposition of a parameter snapshot matrix. To evaluate the reduced-order model at a new parameter, we interpolate a subset of the right singular vectors to generate the reduced-order model's coefficients. We employ a novel method to select this subset that uses the parameter gradient of the right singular vectors to split the terms in the expansion, yielding a mean prediction and a prediction covariance, similar to a Gaussian process approximation. The covariance serves as a confidence measure for the reduced-order model. We demonstrate the efficacy of the reduced-order model using a parameter study of heat transfer in random media. The high-fidelity simulations produce more than 4 TB of data; we compute the singular value decomposition and evaluate the reduced-order model using scalable MapReduce/Hadoop implementations. We compare the accuracy of our method with a scalar response surface on a set of temperature profile measurements and find that our model better captures sharp, local features in the parameter space.

Key words. model reduction, simulation informatics, MapReduce, Hadoop, tall and skinny SVD

1. Introduction & motivation. High-fidelity simulations of partial differential equations are typically too expensive for design optimization and uncertainty quantification, where many independent runs are necessary. Cheaper reduced-order models (ROMs) that approximate the map from simulation inputs to quantities of interest may replace expensive simulations to enable such parameter studies. These ROMs are constructed from a relatively small set of high-fidelity runs chosen to cover a range of input parameter values. Each evaluation of the ROM is a linear combination of basis functions derived from the outputs of the high-fidelity runs; ROM constructions differ in their choice of basis functions and their method for computing the coefficients of the linear combination. Projection-based methods project the residual of the governing equations (i.e., a Galerkin projection) to create a relatively small system of equations for the coefficients; see the recent preprint [4] for a survey of projection-based techniques. Alternatively, one may derive a closely related optimization problem to compute the coefficients [9, 10, 14]. These two formulations can provide a measure of confidence along with the ROM. However, they are often difficult to implement in existing solvers because they need access to the equations' operators or residuals. To bypass these implementation difficulties, one may use response surfaces, e.g., collocation [3, 32] or Gaussian process regression [29], which are essentially interpolation methods applied to the high-fidelity outputs. They do not need access to the differential operators or residuals and are therefore relatively easy to implement. However, measures of co...