Radar systems have been widely employed to measure precipitation and predict flood risks. However, radar rainfall estimates contain uncertainties and errors arising from sources such as miscalibration, beam blockage, anomalous propagation, and ground clutter. These error sources have previously been studied individually; in practical applications, however, they cannot be separated and estimated independently. In the current study, a spatial error model based on univariate Gaussian random numbers was employed within a synthetic runoff simulation to analyze the effects of radar rainfall errors, particularly their effect on peak discharge. A Monte Carlo simulation, one of the most widely used techniques for obtaining practical results from intensive repeated simulation, was then performed. The results indicated that the variability of the peak discharge increases as the assumed true rainfall increases. In addition, a higher standard deviation of the tested radar rainfall error leads to a larger peak discharge bias. An additional simulation performed to investigate the cause of this bias revealed that peak discharge increases quadratically with rainfall amount. Because this relation is convex, peak discharges in cells with above-mean rainfall deviate more strongly than those in cells with below-mean rainfall, so the bias arises even when the numbers of cells with lower and higher rainfall amounts are equal.
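To make the simulation chain concrete, the following is a minimal sketch of this kind of experiment, not the study's actual model: it assumes a uniform true rainfall field, an independent multiplicative Gaussian error in each radar cell, and a simple quadratic rainfall-to-peak-discharge transform standing in for the synthetic runoff simulation. The grid size, error standard deviation, and the `peak_discharge` function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative settings (not taken from the study): a 10 x 10 radar
# grid, a uniform "true" rainfall field, and a multiplicative
# Gaussian error applied independently to each cell.
GRID_SHAPE = (10, 10)
TRUE_RAIN_MM = 50.0   # assumed true rainfall per cell [mm]
ERROR_STD = 0.2       # std. dev. of the multiplicative error
N_RUNS = 10_000       # Monte Carlo sample size

def peak_discharge(rain_field):
    """Stand-in rainfall-to-peak-discharge transform.

    A quadratic function of cell rainfall mimics the convex
    relation reported in the study; the actual synthetic runoff
    simulation would route the field through a hydrologic model.
    """
    return np.sum(0.01 * rain_field ** 2)

true_field = np.full(GRID_SHAPE, TRUE_RAIN_MM)
q_true = peak_discharge(true_field)

# Monte Carlo loop: perturb each cell with an independent
# (univariate) Gaussian error and collect the resulting peaks.
peaks = np.empty(N_RUNS)
for i in range(N_RUNS):
    error = rng.normal(loc=1.0, scale=ERROR_STD, size=GRID_SHAPE)
    peaks[i] = peak_discharge(true_field * error)

bias = peaks.mean() / q_true - 1.0
print(f"error-free peak:      {q_true:.1f}")
print(f"mean simulated peak:  {peaks.mean():.1f} (bias {bias:+.1%})")
print(f"peak std. dev.:       {peaks.std():.1f}")
```

Under these assumptions the positive bias follows directly from the convexity: for a multiplicative error e with mean 1 and standard deviation sigma, E[(r e)^2] = r^2 (1 + sigma^2), so the expected simulated peak exceeds the error-free peak by a factor that grows with sigma^2, consistent with the larger bias observed for higher error standard deviations.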