Optical rectification with tilted pulse fronts in lithium niobate crystals is one of the most promising methods for generating terahertz (THz) radiation. To achieve a higher optical-to-THz energy conversion efficiency, the crystal must be cryogenically cooled, both to reduce the linear phonon absorption of the generated THz wave and to lengthen the effective interaction length between the infrared pump pulse and the THz wave. However, the refractive index of lithium niobate at cryogenic temperatures differs from its room-temperature value, so the tilted-pulse-front setup must be re-optimized or even rebuilt. Here, we performed temperature-dependent measurements of the refractive index and absorption coefficient of a 6.0 mol% MgO-doped congruent lithium niobate wafer using a THz time-domain spectrometer (THz-TDS). As the crystal temperature was decreased from 300 K to 50 K, the extraordinary refractive index at 0.4 THz decreased from 5.05 to 4.88, corresponding to a change of ~1° in the pulse-front tilt angle inside the crystal. For a tilted-pulse-front setup pumped at 1030 nm with a demagnification factor of −0.5, the angle of incidence on the grating must be changed by 3°. Over the same temperature range, the absorption coefficient at 0.4 THz decreased by 60%. These results are crucial for designing an optimal tilted-pulse-front setup based on lithium niobate crystals.
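
To illustrate how the measured index change maps onto the required tilt angle, the sketch below evaluates the standard velocity-matching condition for tilted-pulse-front THz generation, cos(γ) = n_g(pump)/n_ph(THz), using the two measured extraordinary indices at 0.4 THz. The optical group index of ~2.2 at 1030 nm is an assumed, illustrative value and not a result reported here.

```python
import math

def tilt_angle_deg(n_group_pump: float, n_thz: float) -> float:
    """Pulse-front tilt angle gamma from the velocity-matching condition
    cos(gamma) = n_group(pump) / n_phase(THz)."""
    return math.degrees(math.acos(n_group_pump / n_thz))

# Measured extraordinary THz refractive index at 0.4 THz (from the text)
n_thz_300K = 5.05
n_thz_50K = 4.88

# Assumed optical group index of MgO:LiNbO3 at 1030 nm (illustrative value)
n_group = 2.2

gamma_300K = tilt_angle_deg(n_group, n_thz_300K)  # ~64.2 deg
gamma_50K = tilt_angle_deg(n_group, n_thz_50K)    # ~63.2 deg

print(f"Tilt angle at 300 K: {gamma_300K:.1f} deg")
print(f"Tilt angle at  50 K: {gamma_50K:.1f} deg")
print(f"Change on cooling:   {gamma_300K - gamma_50K:.2f} deg")  # ~1 deg, as stated
```

With these inputs the internal tilt angle shifts by roughly 1° on cooling from 300 K to 50 K, consistent with the value quoted above; the corresponding 3° change in grating incidence angle additionally depends on the grating period and imaging geometry, which are not reproduced in this sketch.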