A common problem in the virtualized cloud radio access network (C-RAN) architecture is the compression of the time-domain IQ samples before transmission over the fronthaul link. For a multicarrier waveform such as OFDM, whose IQ samples follow a quasi-Gaussian distribution, the conventional Gaussian quantizer may be regarded as the optimal solution to the compression problem. However, since the high peak-to-average power ratio (PAPR) of OFDM signals remains a serious problem, various techniques may be employed to reduce the time-domain fluctuations of the OFDM IQ samples, resulting in a change in their distribution. This change makes the Gaussian quantizer suboptimal. The literature lacks a performance analysis of conventional OFDM-based compression techniques when the PAPR of the OFDM signal is reduced. Therefore, in this paper, we study, for the first time, the impact of reducing the PAPR of the OFDM signal before compression in the C-RAN architecture through a rate-distortion analysis. We consider clipping and tone reservation PAPR reduction algorithms. The former is the simplest PAPR reduction approach, while the latter is one of the most effective algorithms used in broadcasting standards such as DVB-T2 and ATSC 3.0. We first derive the distribution of the PAPR-reduced OFDM IQ samples. This distribution is then used to optimize the thresholds and codebook levels of a non-uniform scalar quantizer and the number of bits allocated to each quantization level in the entropy coding stage, and to analyze the modulation error ratio (MER) performance. The simulation results show that conventional Gaussian-based compression applied to a PAPR-reduced signal is not robust to the resulting statistical changes unless the signal distribution at the quantizer input remains largely unaffected. However, a significant gain is obtained when the quantizer is optimized with respect to the true distribution of the PAPR-reduced IQ samples.
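To make the processing chain concrete, the following is a minimal sketch of the steps the abstract names: amplitude clipping for PAPR reduction, a non-uniform scalar quantizer fitted to the clipped samples, and an entropy estimate of the bits per quantized sample. The Lloyd-Max-style iteration used here is a standard stand-in for "optimizing the thresholds and codebook levels", not necessarily the paper's method, and all parameters (FFT size, clipping ratio, number of levels) are hypothetical; the SQNR figure is only a rough proxy for the MER analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- OFDM time-domain IQ samples (illustrative parameters) ---
n_sym, n_fft = 200, 1024
qam = (rng.choice([-3, -1, 1, 3], (n_sym, n_fft)) +
       1j * rng.choice([-3, -1, 1, 3], (n_sym, n_fft))) / np.sqrt(10)
x = np.fft.ifft(qam, axis=1) * np.sqrt(n_fft)        # unit average power

# --- Simple amplitude clipping for PAPR reduction (clipping ratio is hypothetical) ---
def clip(x, clip_ratio_db=6.0):
    a_max = 10 ** (clip_ratio_db / 20) * np.sqrt(np.mean(np.abs(x) ** 2))
    mag = np.abs(x)
    return np.where(mag > a_max, a_max * x / mag, x)

x_clipped = clip(x)

# --- Lloyd-Max-style non-uniform scalar quantizer, trained on the real part
#     of the clipped samples (I and Q are treated identically) ---
def lloyd_max(samples, n_levels=16, n_iter=50):
    levels = np.quantile(samples, np.linspace(0.03, 0.97, n_levels))
    for _ in range(n_iter):
        thresholds = (levels[:-1] + levels[1:]) / 2   # nearest-neighbor decision thresholds
        idx = np.digitize(samples, thresholds)
        levels = np.array([samples[idx == k].mean() if np.any(idx == k) else levels[k]
                           for k in range(n_levels)]) # centroid (codebook) update
    return thresholds, levels

s = np.real(x_clipped).ravel()
thresholds, levels = lloyd_max(s)
idx = np.digitize(s, thresholds)
s_hat = levels[idx]

# --- Entropy of the level usage: average bits/sample achievable by entropy coding ---
p = np.bincount(idx, minlength=len(levels)) / idx.size
entropy_bits = -np.sum(p[p > 0] * np.log2(p[p > 0]))

# --- Distortion as an SQNR-style figure (only a proxy for MER) ---
sqnr_db = 10 * np.log10(np.mean(s ** 2) / np.mean((s - s_hat) ** 2))
print(f"avg bits/sample ~ {entropy_bits:.2f}, SQNR ~ {sqnr_db:.1f} dB")
```

Training the same quantizer on unclipped (quasi-Gaussian) samples and applying it to the clipped signal would reproduce, in spirit, the mismatch the paper quantifies: the codebook no longer matches the true distribution of the PAPR-reduced IQ samples.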