High-resolution multi-channel digitizers are used extensively for precision low-voltage measurements and, in power system applications, allow the simultaneous measurement of the voltage magnitude ratio and the phase difference between two waveforms. Delta–sigma-based analog-to-digital conversion enables sampling frequencies in the megahertz range, providing sufficient measurement bandwidth for the accurate measurement of transformed high-frequency, high-voltage signals. With the increasing use of power electronic converters contributing high-frequency harmonic emissions to power systems, there is growing interest in developing calibration systems that measure the voltage ratio and phase difference of distorted fundamental-frequency waveforms with superimposed high-frequency harmonics. However, information on the accuracy of such high-resolution digitizers when measuring distorted voltage waveforms is limited, as characterization is typically performed under sinusoidal voltage waveform conditions. This paper presents the accuracy characterization of a 24-bit digitizer under both sinusoidal and distorted waveform conditions for the measurement of complex voltage ratio and phase error at frequencies up to 10 kHz. The detailed experimental results and measurement uncertainty evaluations show that larger voltage ratio and phase difference error contributions should be allocated when these high-resolution digitizers are used to measure distorted voltage waveforms. The estimated expanded uncertainties of the complex voltage ratio and phase error measurements for harmonic frequencies up to 10 kHz are ±260 ppm and ±100 µrad, respectively.
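As a point of reference, a minimal sketch of the measured quantities, using notation assumed here rather than taken from the paper: if \(\underline{V}_1(f)\) and \(\underline{V}_2(f)\) denote the phasors obtained from the two digitized channels at frequency \(f\) (for example, by a discrete Fourier transform of the sampled records), the complex voltage ratio and the phase difference can be written as

\[
\underline{r}(f) = \frac{\underline{V}_2(f)}{\underline{V}_1(f)}, \qquad
\varphi(f) = \arg \underline{r}(f),
\]

so that the ratio error and the phase error at the fundamental and at each harmonic follow from comparing \(|\underline{r}(f)|\) and \(\varphi(f)\) with their reference values.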