The interstellar medium in star-forming galaxies is a multiphase gas in which turbulent support is at least as important as thermal pressure. Sustaining this configuration requires continuous radiative cooling, such that the overall average cooling rate matches the rate at which turbulent energy decays into heat within the medium. Here we carry out a set of numerical simulations of a stratified, turbulently stirred, radiatively cooled medium, which uncover a fundamental transition at a critical one-dimensional turbulent velocity of ≈ 35 km/s. At turbulent velocities below ≈ 35 km/s, corresponding to temperatures below 10^5.5 K, the medium is stable, because the time for gas to cool is roughly constant as a function of temperature. At turbulent velocities above this critical value, however, the gas is shocked into an unstable regime in which the cooling time increases strongly with temperature, so that a substantial fraction of the interstellar medium is unable to cool on a turbulent dissipation timescale. This naturally leads to runaway heating and the ejection of gas from any stratified medium with a one-dimensional turbulent velocity above ≈ 35 km/s, a result that has implications for galaxy evolution at all redshifts.
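The correspondence between the critical velocity and the quoted temperature can be checked with a rough order-of-magnitude estimate, not taken from the text itself: assuming a mean molecular weight μ ≈ 0.6 and equating the specific thermal energy of shock-heated gas to the specific kinetic energy of the three-dimensional turbulent motions (with σ_3D = √3 σ_1D), one finds

\[
T \sim \frac{\mu m_{\mathrm{p}}\,\sigma_{\mathrm{3D}}^{2}}{k_{\mathrm{B}}}
  = \frac{3\,\mu m_{\mathrm{p}}\,\sigma_{\mathrm{1D}}^{2}}{k_{\mathrm{B}}}
  \approx \frac{3 \times 0.6 \times 1.67\times10^{-24}\,\mathrm{g}\,\times\,(3.5\times10^{6}\,\mathrm{cm\,s^{-1}})^{2}}{1.38\times10^{-16}\,\mathrm{erg\,K^{-1}}}
  \approx 3\times10^{5}\,\mathrm{K},
\]

that is, of order 10^5.5 K for σ_1D ≈ 35 km/s, consistent with the transition temperature quoted above. Both the choice of μ and the exact prefactor in this estimate are assumptions made for illustration only.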