This paper presents a novel procedure for detecting offset, gain, and timing skew errors in the foreground calibration of time-interleaved analog-to-digital converters (TI-ADCs). An efficient peak detection scheme is introduced and employed to extract the timing skew error. The method is then extended to derive the gain and offset errors separately, forming a generalized architecture in which each error is corrected by adjusting a rational coefficient. Simplicity and low hardware consumption are the notable advantages of the proposed algorithm. Mathematical expressions are provided to explain its principles, and the special case in which all three errors exist simultaneously is also analyzed. The proposed architecture is then implemented at the system level. Finally, behavioral simulation results for a 12-bit four-channel TI-ADC are presented to confirm the accuracy of the proposed algorithm. The simulations use an 8 GHz sampling frequency for the TI-ADC and a 1 GHz single-tone input for each channel, together with additive Gaussian noise. Based on the results, a maximum SFDR of 77.21 dB is achieved after calibration when an average timing skew error of 9% and a gain tolerance of 6% are assumed for the channels of the TI-ADC.
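To make the error model concrete, the following minimal NumPy sketch simulates a four-channel TI-ADC with injected offset, gain, and timing skew errors and recovers them from a known foreground test tone using a per-channel least-squares sine fit. This is a generic foreground estimator, not the paper's peak detection scheme; the error magnitudes, the noise level, and the slightly offset test frequency are illustrative assumptions.

```python
import numpy as np

# Sketch of foreground error extraction for a 4-channel TI-ADC.
# Uses a per-channel least-squares sine fit, NOT the paper's
# peak detection scheme; all numeric values are illustrative.

M  = 4                # interleaved channels
fs = 8e9              # aggregate sampling rate (8 GHz, as in the paper)
# A tone at exactly fs/(2*M) = 1 GHz makes the per-channel fit
# rank-deficient, so a slightly offset test frequency is used here.
fin = 0.97e9
N  = 4096             # samples per channel
Ts = 1.0 / fs
w  = 2 * np.pi * fin
n  = np.arange(N)
rng = np.random.default_rng(0)

# Hypothetical per-channel errors the calibration should recover
offsets = np.array([0.010, -0.020, 0.015, -0.005])       # DC offset
gains   = np.array([1.000,  1.030, 0.970,  1.020])       # relative gain
skews   = np.array([0.000,  0.050, -0.040, 0.060]) / fs  # timing skew [s]

for m in range(M):
    t_nom = (n * M + m) * Ts                 # nominal sample instants
    t_act = t_nom + skews[m]                 # skewed actual instants
    x = gains[m] * np.sin(w * t_act) + offsets[m]
    x += rng.normal(0.0, 1e-3, N)            # additive Gaussian noise

    # Fit x ~ a*sin(w*t_nom) + b*cos(w*t_nom) + c; the tone is known
    # in foreground calibration, so the basis is known exactly.
    A = np.column_stack([np.sin(w * t_nom), np.cos(w * t_nom), np.ones(N)])
    a, b, c = np.linalg.lstsq(A, x, rcond=None)[0]

    gain_hat   = np.hypot(a, b)              # tone amplitude -> gain
    offset_hat = c                           # DC term        -> offset
    skew_hat   = np.arctan2(b, a) / w        # phase shift    -> skew

    print(f"ch{m}: offset {offset_hat:+.4f}, gain {gain_hat:.4f}, "
          f"skew {skew_hat * fs:+.4f} Ts")
```

Because the foreground test tone is known, the fitted amplitude, DC term, and phase map directly to the per-channel gain, offset, and timing skew; in a hardware implementation these estimates would drive the correction coefficients described above.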