Purpose: To investigate the accuracy of T2* liver iron quantification using different curve-fitting models under varying acquisition conditions, and to compare, in iron-overloaded patients, the reliability of rapid T2* measurements against approved but slower T2 protocols.
Materials and Methods: Simulations were conducted to assess the influence of three factors on the accuracy of T2* measurement: curve-fitting model, signal-to-noise ratio (SNR), and echo time (TE) spacing. Fifty-four iron-overloaded pediatric patients were assessed using a standard T2 acquisition and two variants of T2* acquisition. In both simulations and in vivo data, three analysis models were evaluated: monoexponential, constant offset, and truncated.
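The three analysis models named above can be sketched as follows. This is a minimal illustration, not the study's implementation: the echo times, signal amplitudes, noise floor, and the truncation threshold (discarding echoes below twice the noise floor) are all hypothetical choices made here for demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic decay data (hypothetical values, not study data):
# a fast-decaying T2* signal sitting on a constant noise floor.
te = np.array([1.0, 2.5, 4.0, 5.5, 7.0, 8.5, 10.0, 11.5])  # echo times, ms
true_t2star, s0, noise_floor = 3.0, 1000.0, 40.0
signal = s0 * np.exp(-te / true_t2star) + noise_floor

def mono(te, s0, t2s):
    # Monoexponential model: S(TE) = S0 * exp(-TE / T2*)
    return s0 * np.exp(-te / t2s)

def offset(te, s0, t2s, c):
    # Constant-offset model: S(TE) = S0 * exp(-TE / T2*) + C
    return s0 * np.exp(-te / t2s) + c

# Monoexponential fit over all echoes (biased by the noise floor).
p_mono, _ = curve_fit(mono, te, signal, p0=(signal[0], 2.0))

# Offset fit: the extra parameter C absorbs the noise floor.
p_off, _ = curve_fit(offset, te, signal, p0=(signal[0], 2.0, 0.0))

# Truncated fit: discard late echoes that have decayed into the noise
# (threshold is an illustrative choice), then refit the monoexponential.
keep = signal > 2 * noise_floor
p_trunc, _ = curve_fit(mono, te[keep], signal[keep], p0=(signal[0], 2.0))

print(p_mono[1], p_off[1], p_trunc[1])  # fitted T2* (ms) per model
```

On this idealized data, the monoexponential fit overestimates T2* because the noise floor flattens the late echoes, while the offset and truncated fits largely remove that bias, which mirrors the trade-offs the simulations examine.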
Results: Simulations show that the truncated model provides the best accuracy but is susceptible to underestimation at high iron concentrations under low SNR or a high minimum TE. In contrast, the offset model tends to overestimate but yields the most reliable measurements across the clinically relevant range of iron levels. Furthermore, a much lower SNR can be tolerated if the acquisition uses a low minimum TE. In vivo results confirm the theoretical findings and show that T2* measurements can be as reliable as those from approved but slower T2 protocols.
Conclusion: Guidelines are provided on choosing an appropriate model under specific noise conditions and acquisition schemes to ensure accurate and rapid T2* liver iron quantification.