The phase and ratio errors of transducers for electrical distribution grids can be measured, over a wide frequency range, with a synchronized pair of Agilent 3458A digitizers operating in DCV mode. The high metrological performance of the digitizers is, however, degraded by the cutoff frequency of the input low-pass filter, which depends on the selected range. This paper proposes a technique to identify and model, for each range, the complex transfer function of this filter up to several tens of kilohertz. First tests show its potential to reduce the errors introduced by the digitizers, when set on different ranges, in the frequency calibration of voltage and current transducers.
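As a rough illustration of the correction idea summarized above, the sketch below models each range's input filter as a first-order low-pass response and removes its contribution from the complex ratio measured by two synchronized channels. The first-order model, the cutoff values, and the function names are illustrative assumptions, not taken from the paper or from 3458A documentation.

```python
import numpy as np

# Hypothetical per-range cutoff frequencies of the input low-pass filter
# (placeholder values, not 3458A specifications).
range_cutoff_hz = {"1V": 120e3, "10V": 150e3}

def filter_response(freq_hz, cutoff_hz):
    """Complex response of an assumed first-order low-pass filter."""
    return 1.0 / (1.0 + 1j * freq_hz / cutoff_hz)

def corrected_ratio_and_phase(freq_hz, raw_ratio, raw_phase_rad,
                              range_a, range_b):
    """Remove the modeled filter contribution of each digitizer channel."""
    h_a = filter_response(freq_hz, range_cutoff_hz[range_a])
    h_b = filter_response(freq_hz, range_cutoff_hz[range_b])
    # The measured complex ratio is scaled by H_a / H_b; divide it out.
    correction = h_a / h_b
    ratio = raw_ratio / abs(correction)
    phase = raw_phase_rad - np.angle(correction)
    return ratio, phase

# Example: a 10 kHz tone measured with the two channels on different ranges.
ratio, phase = corrected_ratio_and_phase(10e3, 0.99950, 1.2e-4, "1V", "10V")
print(f"corrected ratio = {ratio:.6f}, corrected phase = {phase:.2e} rad")
```

In practice the paper's technique identifies the filter response per range rather than assuming a fixed first-order model; the sketch only shows where such a model, once identified, would enter the ratio and phase correction.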