We demonstrate how to quantify the frequency-domain amplitude and phase accuracy of waveform models, $\delta A$ and $\delta \phi$, in a form that can be marginalized over in gravitational-wave inference using techniques currently applied to quantify calibration uncertainty. For concreteness, we consider waveform uncertainties affecting neutron-star inspiral measurements, making post-hoc error estimates for a variety of waveform models by comparing time-domain and frequency-domain analytic models with multiple-resolution numerical simulations.
These waveform uncertainty estimates can be compared to GW170817 calibration envelopes or to Advanced LIGO and Virgo calibration goals. Signal-specific calibration and waveform uncertainties are compared to statistical fluctuations in gravitational-wave observatories, yielding frequency-dependent modeling requirements for detectors such as Advanced LIGO Plus, Cosmic Explorer, or Einstein Telescope. Finally, the distribution of waveform error for the GW170817 posterior is computed from tidal models and compared to the constraints on $\delta \phi$ and $\delta A$ from GWTC-1 by Edelman et al. More generally, $\delta \phi$ and $\delta A$ can also be interpreted in terms of unmodeled astrophysical energy transfer within or from the source system.
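To make the parametrization concrete, a minimal sketch of the calibration-style decomposition assumed here: a frequency-domain model waveform $\tilde h_{\mathrm{M}}(f)$ is related to the true waveform $\tilde h(f)$ through fractional amplitude and phase corrections,
\begin{equation}
\tilde h(f) = \tilde h_{\mathrm{M}}(f)\,\bigl[1 + \delta A(f)\bigr]\,e^{i\,\delta\phi(f)},
\end{equation}
so that $\delta A$ and $\delta \phi$ play the same role as the amplitude and phase components of a detector calibration envelope and can be marginalized over with the same machinery.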