Cold, neutral interstellar gas, the reservoir for star formation, is traced through 21-centimetre absorption of the background radio continuum by neutral hydrogen (H I). Although detected in one hundred cases in the host galaxies of distant radio sources, only recently have column densities approaching the maximum value observed in Lyman-α absorption systems (N_HI ∼ 10²² cm⁻²) been found. Here we explore the implications these have for the hypothesis that the detection rate of H I absorption is dominated by the ionising photon rate from the active galactic nucleus (AGN). We find, with the addition of all of the current searches for H I absorption at z ≥ 0.1, a strong correlation between the H I absorption strength and the ionising photon rate, with the maximum value at which H I is detected (Q_HI = 2.9 × 10⁵⁶ ionising photons s⁻¹) remaining close to the theoretical value at which all of the neutral gas in a large spiral galaxy would be ionised. We also rule out other effects (excitation by the radio continuum and changing gas properties) as the dominant cause of the decrease in detection rate with redshift. Furthermore, from the maximum theoretical column density, we find that the five high-column-density systems have spin temperatures close to those of the Milky Way (T_spin ≲ 300 K), whereas, from our model of a gaseous galactic disk, the H I detection at Q_HI = 2.9 × 10⁵⁶ s⁻¹ yields T_spin ∼ 10 000 K, consistent with the gas being highly ionised.
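The spin temperatures quoted above follow from the standard 21-cm relation, N_HI = 1.823 × 10¹⁸ (T_spin/f) ∫τ dv, which can be inverted for T_spin once a column density and a velocity-integrated optical depth are in hand. A minimal sketch of that inversion is given below; the input values are illustrative placeholders chosen for the example, not the measurements reported in this work.

```python
# Sketch of inverting the standard 21-cm column density relation:
#   N_HI [cm^-2] = 1.823e18 * (T_spin [K] / f) * ∫ tau dv [km/s]
# where f is the covering factor of the background continuum source.

def spin_temperature(N_HI, int_tau_dv, covering_factor=1.0):
    """Spin temperature [K] from column density [cm^-2] and
    velocity-integrated optical depth [km/s]."""
    return N_HI * covering_factor / (1.823e18 * int_tau_dv)

# Illustrative (assumed) inputs: a column density near the Lyman-alpha
# maximum (~1e22 cm^-2) with an assumed integrated optical depth of
# 18.3 km/s gives T_spin ~ 300 K, of the order of the Milky Way values
# quoted in the text.
print(round(spin_temperature(1e22, 18.3)))
```

Note that T_spin scales linearly with both N_HI and f, so for a fixed absorption strength a higher assumed column density, or a smaller covering factor, directly raises the inferred spin temperature.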