Tunnel magnetoresistance (TMR) sensors, known for their high sensitivity, efficiency, and compact size, are well suited to detecting weak currents, particularly leakage currents in smart grids. However, temperature variations degrade their measurement accuracy. This work analyzes the operating principles and temperature characteristics of TMR sensors and proposes a high-precision, software-based temperature compensation method that combines cubic spline interpolation with polynomial regression and zero-point self-calibration. A field-programmable gate array (FPGA)-based temperature compensation circuit was also designed and implemented, and an experimental platform was built to evaluate the sensor's performance under various temperature conditions. Experimental results show that the proposed method significantly improves the sensor's temperature stability, reduces the sensitivity temperature drift coefficient, and improves zero-point drift stability, outperforming other compensation methods. After compensation, the sensor's measurement accuracy in complex temperature environments is substantially improved, enabling effective weak current detection in smart grids across diverse operating conditions.
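To make the named compensation pipeline concrete, the sketch below shows one plausible reading of "cubic spline interpolation combined with polynomial regression": the spline densifies a sparse calibration table, a low-order polynomial is then fitted to the densified curve to give a compact runtime model, and a zero-point self-calibration residual is subtracted before scaling. All calibration values, the polynomial order, and this particular composition are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical calibration table (values are illustrative, not from the
# paper): TMR sensitivity (mV/mA) and zero-point offset (mV) measured at
# several reference temperatures (degrees C).
cal_temp = np.array([-20.0, 0.0, 20.0, 40.0, 60.0, 80.0])
cal_sens = np.array([10.20, 10.10, 10.00, 9.88, 9.74, 9.58])
cal_zero = np.array([1.8, 0.9, 0.0, -1.1, -2.3, -3.6])

# Step 1: cubic spline interpolation recovers smooth sensitivity and
# offset curves between the sparse calibration points.
sens_spline = CubicSpline(cal_temp, cal_sens)
zero_spline = CubicSpline(cal_temp, cal_zero)

# Step 2: polynomial regression (order 3 here, an assumed choice) on the
# spline-densified data yields a compact model that hardware can evaluate
# with a handful of multiply-accumulate operations.
dense_t = np.linspace(cal_temp[0], cal_temp[-1], 121)
sens_model = np.polynomial.Polynomial.fit(dense_t, sens_spline(dense_t), deg=3)
zero_model = np.polynomial.Polynomial.fit(dense_t, zero_spline(dense_t), deg=3)

def compensate(raw_mv: float, temp_c: float, zero_residual_mv: float = 0.0) -> float:
    """Map a raw sensor output (mV) at temperature temp_c to a
    temperature-compensated current estimate (mA).

    zero_residual_mv is the leftover offset captured by the zero-point
    self-calibration step (sensor output with no current applied).
    """
    # Zero-point correction: remove the modeled offset plus the
    # self-calibration residual, then rescale by the modeled sensitivity.
    corrected = raw_mv - zero_model(temp_c) - zero_residual_mv
    return corrected / sens_model(temp_c)

# Example: a 50.3 mV reading at 35 degrees C, with no self-cal residual.
print(f"{compensate(50.3, 35.0):.3f} mA")
```

A low-order polynomial evaluated via Horner's rule maps naturally onto fixed-point multiply-accumulate stages, which is presumably why a polynomial form suits the FPGA implementation described in the paper; the exact hardware realization, however, is not detailed here.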