We study the Rayleigh–Taylor (RT) mixing layer, presenting simulations in agreement with experimental data. This problem is an idealized subproblem of important scientific and engineering problems, such as gravitationally induced mixing in oceanography and performance assessment for inertial confinement fusion. Engineering codes commonly achieve correct simulations through the calibration of adjustable parameters; in this sense they are interpolative rather than predictive. As computational science moves from the interpolative to the predictive and reduces its reliance on experiment, the quality of decision making improves. Because the diagnosis of errors in a multi-parameter, multi-physics setting is daunting, we address this issue in the idealized setting proposed here. The validation tests presented thus serve as a test of engineering codes when they are applied to complex problems containing RT features. The RT growth rate, characterized by a dimensionless but non-universal parameter α, describes the outer edge of the mixing zone. Increasingly accurate front tracking/large eddy simulations reveal the non-universality of the growth rate and its agreement with experimental data. Increased mesh resolution allows a reduction in the role of key subgrid models. We study the effect of long-wavelength perturbations on the mixing growth rate. A self-similar power law for the initial perturbation amplitudes is inferred here from experimental data. We show a maximum ±5% effect on the growth rate. Large effects (factors of 2), as predicted by some models and many simulations, are inconsistent with the experimental data of Youngs and co-authors. The inconsistency of such models lies in their treatment of the dynamics of bubbles, which are the shortest-wavelength modes for this problem. An alternative theory for this shortest wavelength, based on the bubble merger model, was previously shown to be consistent with experimental data.