We argue that the difference in the infrared-to-radio luminosity ratio between local and high-redshift star-forming galaxies reflects the different physical conditions ---including magnetic field configurations--- of the dominant population of star-forming galaxies in different redshift ranges. Based on our reference model, we define three galactic types according to the ages of their stellar populations. ``Normal'' late-type galaxies dominate the star formation in the nearby Universe; ``starburst'' galaxies take over at higher redshifts, up to $z \simeq 1.5$; while ``protospheroidal'' galaxies dominate at high redshift. A reanalysis of data from the COSMOS field, combined with literature results, shows that, for each population, the data are consistent with an almost redshift-independent mean value of the parameter $q_{\rm IR}$, which quantifies the infrared--radio correlation. However, we find a hint of an upturn of the mean $q_{\rm IR}$ at $z \gtrsim 3.5$, consistent with the predicted dimming of synchrotron emission due to the cooling of relativistic electrons by inverse Compton scattering off the cosmic microwave background. The typical stellar masses increase from normal, to starburst, and to protospheroidal galaxies, accounting for the reported dependence of the mean $q_{\rm IR}$ on stellar mass. The higher values of $q_{\rm IR}$ found for high-$z$ strongly lensed dusty galaxies selected at $500\,\mu$m might be explained by differential magnification.