Supersonic gas turbulence is a ubiquitous property of the interstellar medium. The level of turbulence, quantified by the gas velocity dispersion ($\sigma_{\rm g}$), is observed to increase with the star formation rate (SFR) of a galaxy, but it is not yet established whether this trend is driven by stellar feedback or by gravitational instabilities. In this work we carry out hydrodynamical simulations of entire disc galaxies, with different gas fractions, to understand the origin of the SFR–$\sigma_{\rm g}$ relation. We show that disc galaxies reach the same levels of turbulence regardless of whether stellar feedback processes are included, and argue that this is an outcome of the way disc galaxies regulate their gravitational stability. The simulations match the SFR–$\sigma_{\rm g}$ relation up to SFRs of the order of tens of ${\rm M_\odot\, yr^{-1}}$ and $\sigma_{\rm g}\sim 50{\, \rm km\, s^{-1}}$ in neutral hydrogen and molecular gas, but fail to reach the very large values ($>100{\, \rm km\, s^{-1}}$) reported in the literature for rapidly star-forming galaxies. We demonstrate that such high values of $\sigma_{\rm g}$ can be explained by (1) insufficient corrections for beam smearing in observations, and (2) stellar feedback coupling to the ionised gas phase traced by recombination lines. Given that the observed SFR–$\sigma_{\rm g}$ relation is composed of highly heterogeneous data, with $\sigma_{\rm g}$ at high SFRs almost exclusively derived from H$\alpha$ observations of high-redshift galaxies with complex morphologies, we caution against analytical models that attempt to explain the SFR–$\sigma_{\rm g}$ relation without accounting for these effects.
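As a minimal order-of-magnitude sketch of the beam-smearing effect invoked above (assuming a Gaussian beam of FWHM $\theta_{\rm b}$ and a locally linear line-of-sight velocity gradient ${\rm d}v_{\rm los}/{\rm d}x$ across it, both symbols introduced here for illustration), unresolved rotational shear adds in quadrature to the intrinsic dispersion recovered by a luminosity-weighted second-moment measurement,
$$
\sigma_{\rm obs}^{2} \simeq \sigma_{\rm g}^{2} + \left(\frac{{\rm d}v_{\rm los}}{{\rm d}x}\,\frac{\theta_{\rm b}}{2.355}\right)^{2},
$$
so that, for an illustrative inner velocity gradient of $\sim 200{\, \rm km\, s^{-1}\, kpc^{-1}}$ and a kpc-scale beam, the shear term alone contributes $\sim 85{\, \rm km\, s^{-1}}$, i.e. an apparent dispersion comparable to the largest reported $\sigma_{\rm g}$ values even where the intrinsic turbulence is far weaker.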