In this paper we present a theory of the bit error rate (BER) of Euclidean-metric maximum likelihood sequence detectors (EM-MLSD) in the presence of channel mismatch caused by non-Gaussian noise. Although the theory is general, here we focus on the effects of the quantization noise (QN) added by the front-end analog-to-digital converter (ADC) typically used in DSP-based implementations of the receiver. Numerical results show close agreement between the predictions of the theoretical analysis and computer simulations. As a practical application of the proposed theory, we investigate the performance of EM-MLSD in 10 Gb/s Ethernet receivers for multimode optical fibers [1]. Since the BER required in this application is below $10^{-12}$, which precludes the use of computer simulations to estimate it, a theoretical study of the MLSD performance that includes the combined effects of channel dispersion and QN becomes necessary. We present numerical results for the three stressors specified by the 10GBASE-LRM standard. Our study shows that the impact of the QN added by the ADC on performance depends strongly on the channel dispersion (i.e., on the stressor).
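To make the setting concrete, the sketch below illustrates the signal chain under discussion: a dispersive (ISI) channel, additive Gaussian noise, a front-end ADC that introduces quantization noise, and a Viterbi detector using Euclidean branch metrics (EM-MLSD). It is only a toy model; the 2-tap channel, SNR, and ADC resolution are arbitrary illustrative choices and do not represent the 10GBASE-LRM stressors or the paper's analysis. It also makes plain why Monte Carlo estimation is infeasible at BER levels below $10^{-12}$: on the order of $10^{13}$ transmitted bits would be needed to observe even a handful of errors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-tap dispersive channel (NOT a 10GBASE-LRM stressor).
h = np.array([1.0, 0.5])
MEM = len(h) - 1          # channel memory in symbols
SNR_DB = 12.0             # assumed SNR for this toy example
ADC_BITS = 4              # assumed front-end ADC resolution
N_BITS = 100_000          # far short of the ~1e13 bits needed near BER = 1e-12

def adc(x, bits, full_scale):
    """Uniform mid-rise quantizer modeling the front-end ADC."""
    levels = 2 ** bits
    step = 2.0 * full_scale / levels
    x = np.clip(x, -full_scale, full_scale - step)
    return (np.floor(x / step) + 0.5) * step

def em_mlsd(r, h):
    """Viterbi detector with Euclidean branch metrics (EM-MLSD) for binary symbols."""
    mem = len(h) - 1
    n_states = 2 ** mem
    n = len(r)
    pm = np.zeros(n_states)                          # path metrics (all start states allowed)
    prev_state = np.zeros((n, n_states), dtype=np.int64)
    prev_bit = np.zeros((n, n_states), dtype=np.int64)
    for k, rk in enumerate(r):
        new_pm = np.full(n_states, np.inf)
        for s in range(n_states):
            past = [(s >> i) & 1 for i in range(mem)]    # previous bits, most recent first
            for a in (0, 1):
                syms = np.array([a] + past) * 2.0 - 1.0  # map bits to +/-1 levels
                y = float(np.dot(h, syms))               # noiseless channel output
                m = pm[s] + (rk - y) ** 2                # Euclidean branch metric
                nxt = ((s << 1) | a) & (n_states - 1)
                if m < new_pm[nxt]:
                    new_pm[nxt] = m
                    prev_state[k, nxt] = s
                    prev_bit[k, nxt] = a
        pm = new_pm
    # Trace back the minimum-metric path to recover the bit decisions.
    s = int(np.argmin(pm))
    out = np.empty(n, dtype=np.int64)
    for k in range(n - 1, -1, -1):
        out[k] = prev_bit[k, s]
        s = prev_state[k, s]
    return out

# Transmit random bits through the ISI channel with AWGN, then quantize.
bits = rng.integers(0, 2, N_BITS)
tx = bits * 2.0 - 1.0
rx = np.convolve(tx, h)[:N_BITS]                     # edge effect on first MEM samples only
noise_std = np.sqrt(np.sum(h ** 2) / 10 ** (SNR_DB / 10))   # simple SNR definition
rx_noisy = rx + noise_std * rng.standard_normal(N_BITS)
rx_q = adc(rx_noisy, ADC_BITS, full_scale=np.sum(np.abs(h)) + 3.0 * noise_std)

det = em_mlsd(rx_q, h)
print(f"Simulated BER with {ADC_BITS}-bit ADC: {np.mean(det != bits):.2e}")
```

Rerunning the sketch with a larger `ADC_BITS` (i.e., less quantization noise) shows the detector approaching its unquantized performance, which is the trade-off the theoretical analysis quantifies without resorting to simulation.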