Abstract. The adaptive time-domain equalizer (TDE) is an important module in digital coherent optical receivers. From an implementation perspective, we analyze and compare in detail the effect of error-signal feedback delay on the convergence performance of a TDE driven by either the least-mean-square (LMS) algorithm or the constant modulus algorithm (CMA). For this purpose, a simplified theoretical model is proposed, based on which iterative equations for the mean and the variance of the tap coefficients are derived, with and without error-signal feedback delay, for both the LMS- and CMA-based methods for the first time. The analytical results show that, as the feedback delay increases, a smaller step size must be used for the TDE to converge, and a slower convergence speed cannot be avoided. Compared with the data-aided LMS-based method, the CMA-based method converges more slowly and exhibits a larger tap-coefficient variance after convergence. Similar results are confirmed by numerical simulations over dispersive fiber channels. As the step size increases, a feedback delay of 20 clock cycles can cause the TDE to diverge. Compared with the CMA-based method, the LMS-based method tolerates a larger feedback delay and allows a larger step size, yielding faster convergence.
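The interplay of step size, feedback delay, and the LMS vs. CMA error signals described above can be made concrete with a toy simulation. The sketch below is not taken from the paper: it models a single-tap equalizer over a flat complex channel, with the tap update driven by an error signal that is `delay` clock cycles old. All names (`run_equalizer`, `mu`, `delay`, `h`) and the single-tap, noise-free setting are illustrative assumptions; the paper's analysis concerns multi-tap TDEs over dispersive fiber channels.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_equalizer(mode="lms", mu=0.05, delay=0, n_sym=5000, h=0.5 + 0.2j):
    """Single-tap adaptive equalizer over a flat complex channel h.

    The tap update at time n uses the error computed `delay` symbols
    earlier, mimicking pipeline latency in the feedback path.
    """
    w = 0.1 + 0.0j   # tap coefficient, initialized far from the optimum 1/h
    pending = []     # (error, input) pairs waiting in the feedback pipeline
    for n in range(n_sym):
        s = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # unit-power QPSK
        x = h * s                        # received sample (noise omitted for clarity)
        y = w * x                        # equalizer output
        if mode == "lms":
            e = s - y                    # data-aided LMS error
        else:
            e = y * (1.0 - abs(y) ** 2)  # CMA error with unit-modulus target R2 = 1
        pending.append((e, x))
        if len(pending) > delay:         # the error becomes available `delay` cycles later
            e_d, x_d = pending.pop(0)
            w = w + mu * e_d * np.conj(x_d)
    return w

h = 0.5 + 0.2j
for delay in (0, 20):
    for mode in ("lms", "cma"):
        w = run_equalizer(mode=mode, mu=0.05, delay=delay, h=h)
        # The modulus error is phase-blind, which is fair to CMA's phase ambiguity.
        print(f"{mode.upper():>3}, delay={delay:2d}: | |w*h| - 1 | = {abs(abs(w * h) - 1):.3e}")
```

At `mu = 0.05` both algorithms converge with and without the 20-cycle delay, but raising `mu` in this toy setting destabilizes the delayed loops at a much smaller step size than the undelayed ones, mirroring the trend stated in the abstract that feedback delay forces a smaller step size and slower convergence.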