This paper introduces a backpropagation-based technique for the calibration of the mismatch errors of time-interleaved analog-to-digital converters (TI-ADCs). The technique is applicable to digital receivers such as those used in coherent optical communications. The error at the slicer of the receiver is processed using a modified version of the well-known backpropagation algorithm from machine learning. The processed slicer error can be applied directly to compensate the TI-ADC mismatch errors with an adaptive equalizer, or it can be used to digitally estimate the mismatch errors so that they can be corrected with analog techniques such as delay cells and programmable gain amplifiers (PGAs). The main advantages of the proposed technique over prior art are its robustness, its speed of convergence, and the fact that it always operates in background mode, independently of the oversampling factor and the properties of the input signal, as long as the receiver converges. Moreover, the technique enables the joint compensation of impairments not addressed by traditional TI-ADC calibration techniques, such as I/Q skew in quadrature modulation receivers. Simulations are presented to demonstrate the effectiveness of the technique, and low-complexity implementation options are discussed.
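To make the idea concrete, the following is a minimal illustrative sketch (not the authors' implementation): the decision-directed slicer error of a simple receiver drives gradient-descent updates of per-channel gain and offset corrections of a 2-way TI-ADC, in the spirit of backpropagating the receiver error toward the interleaved front-end. All channel counts, mismatch values, and step sizes are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 2                                  # number of interleaved channels (assumed)
gain_true = np.array([1.00, 0.95])     # per-channel gain mismatch (assumed)
off_true  = np.array([0.00, 0.03])     # per-channel offset mismatch (assumed)
gain_hat  = np.ones(M)                 # digital gain correction (adapted)
off_hat   = np.zeros(M)                # digital offset correction (adapted)
mu = 1e-2                              # step size (assumed)

for n in range(20000):
    ch = n % M                         # channel sampling at this instant
    sym = rng.choice([-1.0, 1.0])      # transmitted PAM-2 symbol
    x = gain_true[ch] * sym + off_true[ch]   # mismatched ADC sample
    y = gain_hat[ch] * x + off_hat[ch]       # digitally corrected sample
    e = y - np.sign(y)                 # slicer (decision-directed) error
    # Gradient-descent updates of 0.5*e^2 w.r.t. this channel's corrections
    gain_hat[ch] -= mu * e * x
    off_hat[ch]  -= mu * e

print("estimated gain corrections:", gain_hat)    # approx. 1 / gain_true
print("estimated offset corrections:", off_hat)   # approx. -off_true / gain_true
```

In the paper's setting the error would instead be propagated backward through the full adaptive equalizer and interleaving structure, and timing-skew corrections (e.g., via delay cells or a fractional-delay filter) would be adapted alongside gain and offset.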