It is shown that the thermal ionization energy of Mg acceptors in GaN, as determined by temperature-dependent Hall-effect measurements, exhibits the usual dependence on the ionized-impurity concentration observed in many other semiconductors. The difference between the thermal and optical ionization energies of the Mg acceptor can be understood quantitatively with a simple electrostatic interaction model.
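As a sketch of the dependence referred to (the abstract does not state the model explicitly), the concentration dependence of an acceptor ionization energy is commonly written in the standard screening form

\[
E_A(N_i) \;=\; E_A(0) \;-\; \alpha\, N_i^{1/3},
\]

where \(E_A(0)\) is the dilute-limit ionization energy, \(N_i\) the ionized-impurity concentration, and \(\alpha\) a material-dependent constant; whether this particular form is the one used here is an assumption.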