In this study the influence of digital resolution on the properties of a data sample is determined experimentally in mass measurements. A mass comparator with an adjustable digital resolution interval, also known as the quantization step size, was used under well-controlled, repeatable conditions. The same measurement procedure was carried out repeatedly with resolutions differing by up to a factor of 200, and at least 150 mass differences were recorded at every resolution setting. A clear relationship was observed between the digital resolution and the type of random process characterizing the data sample. Analysis shows that a white noise process dominates in data sets measured with the smallest digital resolution steps, from 0.001 mg up to 0.005 mg. For resolutions from 0.01 mg to 0.2 mg, random walk noise is observed, for which the variance of the sample mean can increase proportionally with sample size. We demonstrate that instrumental resolution is a strictly limiting factor in mass measurements only for data sets with significant positive correlations, such as those exhibiting random walk noise. Otherwise, for the white noise process, the smallest achievable variance is inversely proportional to averaging time, as in time-frequency metrology, and is not limited by the instrumental resolution. Our measurement results show that in this case the sample standard deviation of the mean (0.00003 mg) can be more than ten times smaller than that of a single result (0.0005 mg) or than the Type B uncertainty component due to the digital resolution (0.00029 mg).
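The contrast between the two noise regimes described above can be illustrated with a short simulation (a sketch only, not the paper's actual analysis; the unit-variance Gaussian steps and the helper names are assumptions introduced for illustration). For uncorrelated white noise the standard deviation of the sample mean shrinks roughly as 1/sqrt(n), whereas for a random walk, whose readings are strongly positively correlated, it grows with sample size:

```python
import numpy as np

rng = np.random.default_rng(42)

def std_of_sample_mean(process, n, trials=2000):
    """Empirical standard deviation of the mean of n observations,
    estimated over many independent realizations of the process."""
    means = [process(n).mean() for _ in range(trials)]
    return float(np.std(means))

def white_noise(n):
    # Uncorrelated readings with unit standard deviation (hypothetical model).
    return rng.normal(0.0, 1.0, n)

def random_walk(n):
    # Cumulative sum of unit-variance steps: strongly positively correlated.
    return np.cumsum(rng.normal(0.0, 1.0, n))

for n in (10, 100, 1000):
    print(f"n={n:4d}  white noise: {std_of_sample_mean(white_noise, n):.4f}  "
          f"random walk: {std_of_sample_mean(random_walk, n):.2f}")
```

Averaging more readings helps only in the white-noise case; for the random walk the sample mean becomes less reproducible as n grows, which is why instrumental resolution is the binding limit only for the positively correlated data sets.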