The far-infrared-radio correlation connects star formation and magnetic fields in galaxies, and has been confirmed over a large range of far-infrared/radio luminosities, both in the local Universe and even at redshifts of $z \sim 2$. Recent investigations indicate that it may even hold in the regime of local dwarf galaxies, and we therefore explore here the expected behavior at star formation surface densities below $0.1\,\mathrm{M}_\odot\,\mathrm{kpc}^{-2}\,\mathrm{yr}^{-1}$. We derive two conditions that can be particularly relevant for inducing a change in the expected correlation: a critical star formation surface density required to maintain the correlation between star formation rate and magnetic field strength, and a critical star formation surface density below which cosmic-ray diffusion losses dominate over their injection via supernova explosions. For rotation periods shorter than $1.5 \times 10^{7}\,(H/\mathrm{kpc})^{2}$ yr, with $H$ the scale height of the disk, the first correlation breaks down before diffusion losses become relevant, as higher star formation rates are required to maintain the correlation between star formation rate and magnetic field strength. For high star formation surface densities $\Sigma_{\mathrm{SFR}}$, we derive a characteristic scaling of the non-thermal radio to the far-infrared/infrared emission with $\Sigma_{\mathrm{SFR}}^{1/3}$, corresponding to a scaling of the non-thermal radio luminosity $L_{\mathrm{s}}$ with the infrared luminosity $L_{\mathrm{th}}$ as $L_{\mathrm{s}} \propto L_{\mathrm{th}}^{4/3}$. This scaling is expected to change when the above processes are no longer steadily maintained. In the regime of long rotation periods, we expect a transition towards a steeper scaling with $\Sigma_{\mathrm{SFR}}^{2/3}$, implying $L_{\mathrm{s}} \propto L_{\mathrm{th}}^{5/3}$, while the regime of fast rotation is expected to show considerably enhanced scatter, as a well-defined relation between star formation and magnetic field strength is not maintained. These scaling relations explain the increasing thermal fraction of the radio emission observed in local dwarfs, and can be tested with future observations by LOFAR as well as the SKA and its precursor radio telescopes.
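To make the regime structure concrete, the following minimal Python sketch encodes only the relations quoted above: the critical rotation period $1.5 \times 10^{7}\,(H/\mathrm{kpc})^{2}$ yr and the predicted power-law slopes of $L_{\mathrm{s}}$ versus $L_{\mathrm{th}}$ (4/3 in the steady high-$\Sigma_{\mathrm{SFR}}$ regime, 5/3 for slow rotation, no well-defined slope under fast rotation). The function names are illustrative, and the critical surface density is treated as a free input (`sigma_sfr_crit`); its actual value follows from the derivation in the main text.

```python
# Illustrative sketch, not the paper's calculation: encodes only the
# relations quoted in the abstract -- the critical rotation period
# 1.5e7 * (H/kpc)^2 yr and the predicted L_s--L_th power-law slopes.
# The critical surface density sigma_sfr_crit is an assumed free input.

def critical_rotation_period_yr(h_kpc: float) -> float:
    """Rotation period below which the SFR--magnetic-field correlation
    breaks down before diffusion losses matter: 1.5e7 * (H/kpc)^2 yr."""
    return 1.5e7 * h_kpc**2

def predicted_slope(sigma_sfr: float, sigma_sfr_crit: float,
                    rotation_period_yr: float, h_kpc: float) -> float:
    """Power-law index of L_s vs L_th in each regime quoted above:
    4/3 in the steady high-Sigma_SFR regime, 5/3 for slow rotation at
    low Sigma_SFR (diffusion-loss regime), NaN for fast rotation at
    low Sigma_SFR, where only enhanced scatter is expected."""
    if sigma_sfr >= sigma_sfr_crit:
        return 4.0 / 3.0  # L_s scales as L_th^(4/3)
    if rotation_period_yr > critical_rotation_period_yr(h_kpc):
        return 5.0 / 3.0  # steeper scaling, L_s scales as L_th^(5/3)
    return float("nan")   # no well-defined relation expected

# Example: a disk with H = 0.5 kpc and a 2e8 yr rotation period at
# Sigma_SFR = 0.01 Msun/kpc^2/yr, taking the abstract's quoted regime
# boundary of 0.1 Msun/kpc^2/yr as an assumed critical value.
print(critical_rotation_period_yr(0.5))      # 3.75e6 yr
print(predicted_slope(0.01, 0.1, 2e8, 0.5))  # 1.667, i.e. the 5/3 regime
```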