Orthogonal frequency division multiplexing (OFDM) has been adopted in many modern communication systems due to its robustness against frequency-selective fading channels as well as its near-rectangular spectrum, which achieves high spectral efficiency. Its major drawback, however, is the high peak-to-average power ratio (PAPR) of the resulting signal, which causes severe nonlinear distortion at the power amplifier (PA) unless the input back-off is chosen sufficiently large. The effect of the nonlinear distortion is twofold: out-of-band radiation and in-band signal quality degradation. The former causes adjacent channel interference and thus degrades bandwidth efficiency. The latter affects system-level performance and is often measured by the error vector magnitude (EVM). It is therefore important for the system designer to analyze the nonlinear distortion caused by a given PA in terms of power spectral density (PSD) and EVM, but accurate calculation of these characteristics is generally involved. In this work, by establishing the link between the cross-correlation coefficient of the PA input and output signals and the resulting PSD, we characterize the in-band and out-of-band distortion of nonlinearly amplified OFDM signals based exclusively on this cross-correlation coefficient. The accuracy of the proposed approach is confirmed by both simulation and measurement with a real PA.
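As an illustration of the correlation-based viewpoint (our own minimal sketch, not the paper's procedure), the following Python snippet passes an oversampled QPSK-OFDM signal through an assumed memoryless soft-limiter PA, estimates the input-output cross-correlation coefficient, and checks the standard Bussgang-type relation between that coefficient and the EVM. The soft-limiter model, the back-off value, and all parameter choices are assumptions for demonstration only.

```python
import numpy as np

# Sketch: cross-correlation coefficient of a nonlinearly amplified OFDM
# signal and the resulting EVM via a Bussgang-style decomposition.
# PA model and all parameters below are illustrative assumptions.

rng = np.random.default_rng(0)
N, num_symbols, osf = 256, 200, 4          # subcarriers, OFDM symbols, oversampling

# Baseband OFDM: QPSK on N subcarriers, oversampled IFFT
bits = rng.integers(0, 4, size=(num_symbols, N))
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
spec = np.zeros((num_symbols, N * osf), dtype=complex)
spec[:, :N // 2] = qpsk[:, :N // 2]        # positive-frequency subcarriers
spec[:, -N // 2:] = qpsk[:, N // 2:]       # negative-frequency subcarriers
x = np.fft.ifft(spec, axis=1) * np.sqrt(N * osf)

# Memoryless soft limiter as a stand-in PA nonlinearity (assumed model)
ibo_db = 5.0                               # input back-off in dB (assumed)
a_max = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (ibo_db / 20)
r = np.abs(x)
y = np.where(r > a_max, a_max * x / r, x)  # clip envelope, keep phase

# Cross-correlation coefficient rho = E[y x*] / sqrt(E|x|^2 E|y|^2)
rho = np.mean(y * np.conj(x)) / np.sqrt(
    np.mean(np.abs(x) ** 2) * np.mean(np.abs(y) ** 2))

# Bussgang split y = alpha*x + d, with d uncorrelated with x; then
# EVM^2 = E|d|^2 / (|alpha|^2 E|x|^2) = 1/|rho|^2 - 1
alpha = np.mean(y * np.conj(x)) / np.mean(np.abs(x) ** 2)
d = y - alpha * x
evm = np.sqrt(np.mean(np.abs(d) ** 2) /
              (np.abs(alpha) ** 2 * np.mean(np.abs(x) ** 2)))
print(f"|rho| = {np.abs(rho):.4f}, EVM = {100 * evm:.2f}% "
      f"(vs sqrt(1/|rho|^2 - 1) = "
      f"{100 * np.sqrt(1 / np.abs(rho) ** 2 - 1):.2f}%)")
```

The two printed EVM values should agree, illustrating that the in-band distortion is fully determined by the magnitude of the cross-correlation coefficient under this uncorrelated-distortion decomposition; the out-of-band behavior would additionally require the PSD link established in the paper.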