A tight relation between the [C ii] 158 $\mu$m line luminosity and the star formation rate (SFR) is measured in local galaxies. At high redshift ($z > 5$), though, a much larger scatter is observed, with a considerable (15–20 per cent) fraction of the outliers being [C ii]-deficient. Moreover, the [C ii] surface brightness ($\Sigma_{\rm [C\,II]}$) of these sources is systematically lower than expected from the local relation. To clarify the origin of this [C ii] deficiency, we have developed an analytical model that fits the local [C ii] data and has been validated against radiative transfer simulations performed with \textsc{cloudy}. The model predicts an overall increase of $\Sigma_{\rm [C\,II]}$ with $\Sigma_{\rm SFR}$. However, for $\Sigma_{\rm SFR} \gtrsim 1\,{\rm M}_\odot\,{\rm yr}^{-1}\,{\rm kpc}^{-2}$, $\Sigma_{\rm [C\,II]}$ saturates. We conclude that underluminous [C ii] systems can result from a combination of three factors: (a) large upward deviations from the Kennicutt–Schmidt relation, parametrized by the 'burstiness' parameter $\kappa_{\rm s} \gg 1$; (b) low metallicity; (c) low gas density, at least for the most extreme sources (e.g. CR7). Observations of [C ii] emission alone cannot break the degeneracy among these three parameters; this requires additional information from other emission lines (e.g. [O iii] 88 $\mu$m, C iii] 1909 Å, CO lines). Simple formulae are given to interpret the available data for low- and high-$z$ galaxies.
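As an illustrative sketch only (the symbol $\Delta_{\rm [C\,II]}$ and this particular definition are notational choices introduced here, not taken from the text above), the [C ii] deficiency can be quantified relative to the surface brightness expected from the local relation at the observed $\Sigma_{\rm SFR}$:
\begin{equation}
  \Delta_{\rm [C\,II]} \;\equiv\;
  \log \Sigma_{\rm [C\,II]}^{\rm obs} \;-\;
  \log \Sigma_{\rm [C\,II]}^{\rm local}(\Sigma_{\rm SFR}),
  \qquad
  \Delta_{\rm [C\,II]} < 0 \;\Rightarrow\; \mbox{[C\,II]-deficient} .
\end{equation}
Under this (assumed) definition, the predicted saturation of $\Sigma_{\rm [C\,II]}$ above $\Sigma_{\rm SFR} \gtrsim 1\,{\rm M}_\odot\,{\rm yr}^{-1}\,{\rm kpc}^{-2}$ would appear as increasingly negative $\Delta_{\rm [C\,II]}$ in compact, highly star-forming (large $\kappa_{\rm s}$) systems.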