The effects of subgrid cloud variability on grid-average microphysical rates and radiative fluxes are examined using long-term retrieval products from the Tropical Western Pacific, Southern Great Plains, and North Slope of Alaska sites of the Department of Energy's Atmospheric Radiation Measurement program. Four commonly used distribution functions (the truncated Gaussian, Gamma, lognormal, and Weibull) are constrained to have the same mean and standard deviation as the observed cloud liquid water content. The resulting probability density functions are then used to upscale the relevant physical processes and obtain grid-average process rates. The truncated Gaussian representation results in a mean bias of up to 30% in the autoconversion rate, whereas the mean bias for the lognormal representation is about 10%. The Gamma and Weibull distributions perform best for the grid-average autoconversion rate, with mean relative biases of less than 5%. For radiative fluxes, the lognormal and truncated Gaussian representations perform better than the Gamma and Weibull representations. These results show that the optimal choice of subgrid cloud distribution function depends on the nonlinearity of the process of interest; thus, no single distribution function works best for all parameterizations. Examination of the scale (window size) dependence of the mean bias indicates that the bias in grid-average process rates increases monotonically with window size, suggesting that subgrid variability becomes increasingly important at larger grid sizes.
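To make the upscaling step concrete, the sketch below moment-matches a lognormal and a Gamma distribution to a given grid-box mean and standard deviation of liquid water content, then integrates a nonlinear process rate over each probability density function. It is illustrative only: the power-law exponent p = 2.47 is the Khairoutdinov-Kogan autoconversion exponent and is assumed here, not taken from this study, and the grid-box statistics are hypothetical. The truncated Gaussian and Weibull cases follow the same pattern with their own moment-matching steps.

```python
# Sketch of PDF-based upscaling of a nonlinear process rate.
# Assumptions (not from the paper): an illustrative power-law
# autoconversion rate A(q) = C * q**P with P = 2.47 (the
# Khairoutdinov-Kogan exponent) and hypothetical grid-box statistics.
import numpy as np
from scipy import stats
from scipy.integrate import quad

P = 2.47    # nonlinearity exponent of the local process rate (assumed)
C = 1350.0  # rate prefactor; cancels out of relative comparisons

def rate(q):
    """Local process rate as a function of liquid water content q."""
    return C * q**P

def grid_average_rate(pdf, qmax):
    """Upscale: integrate the local rate weighted by the subgrid PDF."""
    val, _ = quad(lambda q: rate(q) * pdf(q), 0.0, qmax)
    return val

# Hypothetical grid-box mean and standard deviation of LWC (g m^-3).
m, s = 0.30, 0.20

# Moment-match a lognormal so that E[q] = m and Var[q] = s**2.
sig2 = np.log(1.0 + (s / m)**2)
mu_ln = np.log(m) - 0.5 * sig2
lognorm = stats.lognorm(s=np.sqrt(sig2), scale=np.exp(mu_ln))

# Moment-match a Gamma: shape k = m^2/s^2, scale theta = s^2/m.
gamma = stats.gamma(a=(m / s)**2, scale=s**2 / m)

qmax = m + 20.0 * s   # finite upper limit; the tail beyond is negligible
a_mean = rate(m)      # naive rate evaluated at the grid-mean LWC
a_ln = grid_average_rate(lognorm.pdf, qmax)
a_ga = grid_average_rate(gamma.pdf, qmax)

print(f"rate at grid mean     : {a_mean:.4f}")
print(f"lognormal PDF average : {a_ln:.4f}  (x{a_ln / a_mean:.2f})")
print(f"Gamma PDF average     : {a_ga:.4f}  (x{a_ga / a_mean:.2f})")
```

Because the assumed rate is convex in q, both PDF-weighted averages exceed the rate evaluated at the grid mean, and the heavier-tailed lognormal yields a larger average than the Gamma at the same mean and standard deviation. This is the mechanism behind the abstract's conclusion: how much a given distribution shape matters depends on the nonlinearity of the process being upscaled.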