A well-known drawback of orthogonal frequency-division multiplexing (OFDM) is the high peak-to-average power ratio (PAPR) of its transmit signal. Among the many PAPR reduction techniques, clipping and filtering (CAF) is the simplest: it effectively reduces the PAPR of band-limited OFDM signals at the cost of increased in-band distortion. To mitigate the performance degradation caused by this in-band distortion, several iterative distortion recovery techniques have been proposed in the literature; they are broadly classified into time-domain (TD) and frequency-domain (FD) compensation approaches, represented by decision-aided reconstruction (DAR) and clipping noise cancellation (CNC), respectively. To date, however, their theoretical performance limits have not been studied. In this work, we revisit the performance limits of CAF and derive a closed-form expression for the signal-to-distortion power ratio (SDR). Furthermore, we introduce a time-domain distortion model that characterizes the OFDM signal after CAF, based on which we compare the two compensation approaches. Theoretical analysis and simulations of the achievable symbol error rate (SER) reveal that, unlike its FD counterpart, TD compensation may suffer from unrecoverable distortion when filtering after clipping is applied at the transmitter.
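To make the CAF operation concrete, the following is a minimal NumPy sketch of clipping and filtering applied to one QPSK-modulated OFDM symbol. The subcarrier count N, oversampling factor L, and clipping ratio gamma are illustrative assumptions, not values from the paper: the signal is soft-clipped in the time domain and then filtered in the frequency domain by discarding out-of-band components.

```python
# Minimal sketch of clipping and filtering (CAF) for one OFDM symbol.
# N, L, and gamma are illustrative choices, not parameters from the paper.
import numpy as np

rng = np.random.default_rng(0)

N, L = 64, 4      # data subcarriers, oversampling factor
gamma = 1.5       # clipping ratio: A / sqrt(input power)

def papr_db(s):
    """PAPR of a complex baseband signal, in dB."""
    p = np.abs(s) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Random QPSK symbols, mapped to an oversampled (zero-padded) IFFT grid.
X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
Xos = np.zeros(N * L, dtype=complex)
Xos[:N // 2] = X[:N // 2]      # positive-frequency half
Xos[-N // 2:] = X[N // 2:]     # negative-frequency half
x = np.fft.ifft(Xos) * np.sqrt(N * L)

# Soft envelope clipping: limit the magnitude to A, preserve the phase.
A = gamma * np.sqrt(np.mean(np.abs(x) ** 2))
mag = np.abs(x)
xc = x.copy()
over = mag > A
xc[over] = A * x[over] / mag[over]

# Frequency-domain filtering: zero the out-of-band clipping products.
Xc = np.fft.fft(xc)
inband = np.zeros(N * L, dtype=bool)
inband[:N // 2] = inband[-N // 2:] = True
xcf = np.fft.ifft(np.where(inband, Xc, 0))

print(f"PAPR before: {papr_db(x):.2f} dB, after CAF: {papr_db(xcf):.2f} dB")
```

Running this shows the PAPR reduction, though the filtering step causes some peak regrowth, so the post-CAF PAPR sits above the hard clipping level. The filter removes out-of-band radiation but leaves the in-band clipping distortion intact, which is precisely the distortion that TD (DAR-type) and FD (CNC-type) compensation schemes attempt to recover.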