This correspondence presents an analysis of the influence of finite register length on the accuracy of results obtained with time-frequency distributions (TFD's). In order to measure the quality of the obtained results, the variance of the proposed quantization-error model is determined, the signal-to-quantization-noise ratio (SNR) is defined, and the corresponding expressions are derived. Floating- and fixed-point arithmetic are considered, with the analysis carried out for both discrete random and discrete deterministic signals. It is shown that commonly used reduced interference distributions (RID's) exhibit similar performance with respect to the SNR. Expressions establishing the relationship between the number of bits and the required quality of representation (defined by the SNR) are also derived; these may be used for register-length design in hardware implementations of time-frequency algorithms.
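As a rough illustration of how such bit-count/SNR expressions support register-length design, the sketch below inverts the classical uniform-quantization rule of thumb (SNR_dB ≈ 6.02·B + constant) to estimate the number of quantizer bits needed for a target SNR. This is only the textbook fixed-point model under a full-scale signal assumption, not the TFD-specific expressions derived in this correspondence; the function name and the 1.76 dB offset are illustrative assumptions.

```python
import math

def required_bits(target_snr_db: float, overhead_db: float = 1.76) -> int:
    """Estimate the number of quantizer bits B needed to reach a target SNR.

    Uses the classical uniform-quantization relation
        SNR_dB ~= 6.02 * B + overhead_db
    (full-scale sinusoid assumption). The TFD-specific expressions derived
    in the correspondence would replace this generic rule.
    """
    return math.ceil((target_snr_db - overhead_db) / 6.02)

# Example: a 60 dB SNR target requires roughly 10 bits under this model.
print(required_bits(60.0))
```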