Long-haul high-speed optical transmission systems are significantly distorted by the interplay between electronic chromatic dispersion (CD) equalization and local oscillator (LO) laser phase noise, which gives rise to equalization enhanced phase noise (EEPN). EEPN degrades the performance of optical communication systems severely, and its impact grows with increasing fiber dispersion, LO laser linewidth, symbol rate, and modulation format order. In this paper, we present an analytical model for evaluating the bit-error-rate (BER) performance versus the signal-to-noise ratio (SNR) in the n-level
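
To make this scaling concrete, the EEPN contribution is commonly quantified in the literature (e.g. in the analysis of Shieh and Ho) as an additional phase-noise variance; a sketch of that expression, with symbol definitions as assumed here, is

\begin{equation}
  \sigma_{\mathrm{EEPN}}^{2} = \frac{\pi \lambda^{2}}{2c} \cdot \frac{D \, L \, \Delta f_{\mathrm{LO}}}{T_{S}},
\end{equation}

where $\lambda$ is the carrier wavelength, $c$ is the speed of light in vacuum, $D$ is the fiber dispersion coefficient, $L$ is the transmission distance, $\Delta f_{\mathrm{LO}}$ is the 3-dB linewidth of the LO laser, and $T_{S}$ is the symbol period. The variance grows linearly with the accumulated dispersion $D L$, the LO linewidth, and the symbol rate $1/T_{S}$, consistent with the qualitative dependence stated above.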