Because large nonlinearity errors exist in current tapped-delay-line (TDL) field-programmable gate array (FPGA)-based time-to-digital converters (TDCs), bin-by-bin calibration techniques must be resorted to in order to achieve high measurement resolution. If the TDL in the selected FPGA is significantly affected by changes in ambient temperature, the bin-by-bin calibration table has to be updated frequently. Online calibration and calibration-table updating increase TDC design complexity and limit system performance to some extent. This paper proposes a method to minimize the nonlinearity errors of TDC bins so that bin-by-bin calibration may be omitted while maintaining reasonably high time resolution. The method is a two-pass approach: first, a bin realignment reuses the large number of otherwise wasted zero-width bins in the original TDL and improves the granularity of the bins; second, a bin decimation trades off bin size against uniformity, making time interpolation by the delay line precise enough that bin-by-bin calibration is unnecessary. Using Xilinx 28 nm FPGAs, whose TDL properties are not very sensitive to ambient temperature, the proposed TDC achieves approximately 15 ps root-mean-square (RMS) time resolution in dual-channel time-interval measurements over the operating-temperature range. Because calibration is removed and fewer logic resources are required for data post-processing, the method offers greater multi-channel capability.
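As a rough illustration of the decimation pass (a minimal sketch, not the authors' implementation; the simulated bin-width statistics, the greedy merging rule, and all names such as `decimate_bins` and `target_ps` are assumptions for illustration only), the following Python snippet merges nonuniform raw TDL bins, including zero-width "bubble" bins, into near-uniform decimated bins that can then be converted with a single constant LSB instead of a bin-by-bin calibration table:

```python
import numpy as np

# Hypothetical raw bin widths (in ps) as might be obtained from a
# code-density test on a TDL: wide bins interleaved with zero-width
# "bubble" bins. The statistics here are illustrative, not measured.
rng = np.random.default_rng(0)
raw_widths = np.abs(rng.normal(10.0, 8.0, size=400))
raw_widths[rng.random(400) < 0.3] = 0.0  # zero-width (wasted) bins

def decimate_bins(widths, target_ps):
    """Greedily merge consecutive raw bins into decimated bins of
    roughly `target_ps` width each. Returns the tap-index boundaries
    and the resulting decimated bin widths. Raw bins left over after
    the last boundary are discarded in this simple sketch."""
    edges, acc = [0], 0.0
    for i, w in enumerate(widths):
        acc += w
        if acc >= target_ps:
            edges.append(i + 1)
            acc = 0.0
    merged = [widths[a:b].sum() for a, b in zip(edges[:-1], edges[1:])]
    return edges, np.array(merged)

edges, merged = decimate_bins(raw_widths, target_ps=30.0)
lsb = merged.mean()  # one constant replaces the calibration table

# For an ideal uniform quantizer the RMS error is LSB / sqrt(12);
# residual width nonuniformity adds to this figure.
print(f"decimated bins: {len(merged)}, LSB ~ {lsb:.1f} ps, "
      f"width spread sigma = {merged.std():.1f} ps, "
      f"ideal quantization RMS ~ {lsb / np.sqrt(12):.1f} ps")
```

The trade-off the abstract describes is visible in `target_ps`: larger decimated bins are more uniform (smaller residual nonlinearity, so no per-bin correction is needed) at the cost of a coarser LSB, and the design point balances the two against the target RMS resolution.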