The main goal of the minimum zone tolerance (MZT) method is to achieve the best estimate of the roundness error, but it is computationally intensive. This paper describes the application of a genetic algorithm (GA) to minimize the computation time when evaluating coordinate measuring machine (CMM) roundness errors from a large cloud of sampled data points (equally spaced at 0.2°). Computational experiments have shown that by selecting the optimal GA parameters, namely the combination of four genetic parameters governing population size, crossover, mutation, and the stop condition, the computation time can be reduced by up to one order of magnitude, allowing real-time operation. The optimization has been tested on seven CMM datasets obtained from different machining features and compared with the least-squares (LSQ) method. The performance of the optimized algorithm has been validated against GAs from the literature using four benchmark datasets.
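To make the approach concrete, the following is a minimal sketch (not the paper's implementation) of a GA for MZT roundness evaluation: candidate reference-circle centres evolve to minimize the radial band width (maximum minus minimum radius) over the sampled points, with population size, crossover probability, mutation probability, and a stagnation-based stop condition as the four tunable parameters mentioned above. All function names, parameter values, and the synthetic test profile are illustrative assumptions.

```python
import numpy as np

def roundness_error(center, pts):
    # MZT objective: width of the radial band (max - min radius) about a candidate centre
    r = np.hypot(pts[:, 0] - center[0], pts[:, 1] - center[1])
    return r.max() - r.min()

def ga_minimum_zone(pts, pop_size=50, p_cross=0.9, p_mut=0.1,
                    max_gen=500, stall_gen=50, seed=0):
    rng = np.random.default_rng(seed)
    centroid, scale = pts.mean(axis=0), pts.std()
    # Candidate centres start in a small cloud around the centroid (an LSQ-like guess)
    pop = centroid + 0.1 * scale * rng.standard_normal((pop_size, 2))
    best_c, best_f, stall = pop[0].copy(), np.inf, 0
    for _ in range(max_gen):
        fit = np.array([roundness_error(c, pts) for c in pop])
        i = fit.argmin()
        if fit[i] < best_f - 1e-12:
            best_c, best_f, stall = pop[i].copy(), fit[i], 0
        else:
            stall += 1
            if stall >= stall_gen:  # stop condition: no improvement over stall_gen generations
                break
        # Binary tournament selection
        a, b = rng.integers(pop_size, size=(2, pop_size))
        parents = pop[np.where(fit[a] < fit[b], a, b)]
        # Arithmetic (blend) crossover between randomly paired parents
        mates = parents[rng.permutation(pop_size)]
        w = rng.random((pop_size, 1))
        children = np.where(rng.random((pop_size, 1)) < p_cross,
                            w * parents + (1 - w) * mates, parents)
        # Gaussian mutation with a small step relative to the point spread
        mut = rng.random((pop_size, 1)) < p_mut
        children = children + mut * rng.normal(scale=0.01 * scale, size=(pop_size, 2))
        pop = children
        pop[0] = best_c  # elitism: keep the best centre found so far
    return best_c, best_f

# Example: 1800 points (0.2 deg spacing) on a slightly three-lobed profile
theta = np.deg2rad(np.arange(0.0, 360.0, 0.2))
r = 10.0 + 0.005 * np.sin(3 * theta)
pts = np.column_stack((r * np.cos(theta), r * np.sin(theta)))
center, err = ga_minimum_zone(pts)
print(f"MZT centre = {center}, roundness error = {err:.6f}")
```

The stagnation-based stop condition illustrates why tuning these four parameters trades accuracy against run time: a smaller population or an earlier stop shortens each evaluation of the 1800-point profile, at the risk of converging to a slightly suboptimal centre.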