This paper introduces an on-chip analog calibration method tailored for differential temperature sensors in thermal monitoring applications. A three-step calibration process is proposed within a two-stage high-gain instrumentation amplifier to compensate for the output voltage offset caused by device mismatches and on-chip temperature gradients. The calibration circuits were designed and simulated in a standard 65 nm CMOS process. Results indicate that an input-referred offset with a mean of 0.2 μV can be achieved after calibration, with the standard deviation greatly reduced from σ = 5086 μV to σ = 880.3 μV. Furthermore, Monte Carlo and process-temperature corner simulations show that the proposed analog offset calibration scheme has negligible impact on the sensitivity of the complete temperature sensor circuit.