The finite time, $\tau_{\rm dep}$, over which positrons from $\beta^+$ decays of $^{56}$Co deposit their energy in type Ia supernova ejecta leads, in the case that the positrons are trapped, to a slower decay of the bolometric luminosity than an exponential decline. Significant light-curve flattening is obtained once the ejecta density drops below the value at which $\tau_{\rm dep}$ equals the $^{56}$Co lifetime. We provide a simple method to accurately describe this "delayed deposition" effect, which is straightforward to apply in the analysis of observed light curves. We find that the ejecta heating is dominated by delayed deposition typically from 600 to 1200 days, and only later by the decay of the longer-lived isotopes $^{57}$Co and $^{55}$Fe (assuming solar abundances). For the relatively narrow $^{56}$Ni velocity distributions of commonly studied explosion models, the modification of the light curve depends mainly on the $^{56}$Ni mass-weighted average density, $\rho t^3$. Accurate late-time bolometric light curves, which may be obtained with JWST far-infrared (far-IR) measurements, will thus enable discrimination between explosion models by determining $\rho t^3$ (and the $^{57}$Co and $^{55}$Fe abundances). The light-curve flattening inferred from recent observations, which is uncertain owing to the lack of far-IR data, is readily explained by delayed deposition in models with $\rho t^3 \approx 0.2\,M_\odot\,(10^4\,{\rm km\,s^{-1}})^{-3}$, and does not imply supersolar $^{57}$Co and $^{55}$Fe abundances.
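The qualitative behavior can be illustrated with a short numerical toy model. The sketch below is not the paper's method; it assumes fully trapped positrons, a deposition time scaling as $\tau_{\rm dep} \propto 1/\rho \propto t^3$ with a hypothetical normalization $\tau_{\rm dep} = \tau_{\rm Co}$ at 800 days, and a constant energy-loss rate $e_+/\tau_{\rm dep}(t)$ per positron. It shows how the deposition rate declines more slowly than the $\exp(-t/\tau_{\rm Co})$ of instantaneous deposition once $\tau_{\rm dep}$ exceeds the $^{56}$Co lifetime.

```python
import numpy as np

# Toy illustration of the "delayed deposition" effect (a sketch under the
# stated assumptions, not the paper's calibrated method).
#
# Assumptions (hypothetical normalization):
#   * 56Co mean lifetime tau_co = 111.4 d; the 56Ni -> 56Co step is
#     neglected at these late times.
#   * The positron slowing-down (deposition) time scales with the inverse
#     density of the homologously expanding ejecta, tau_dep ~ 1/rho ~ t^3,
#     normalized so that tau_dep = tau_co at an assumed t_eq = 800 d.
#   * Positrons are fully trapped and lose their kinetic energy at a
#     constant rate e_+ / tau_dep(t) until it is exhausted.

tau_co = 111.4                           # 56Co mean lifetime [day]
t_eq = 800.0                             # tau_dep = tau_co here (assumed) [day]

def tau_dep(t):
    """Positron energy-deposition time [day] under the toy t^3 scaling."""
    return tau_co * (t / t_eq) ** 3

t = np.linspace(200.0, 2000.0, 20000)    # time since explosion [day]
dt = t[1] - t[0]

# Positron kinetic-energy generation rate (arbitrary normalization).
gen = np.exp(-t / tau_co) / tau_co

# Instantaneous deposition: all positron energy thermalized at emission.
q_instant = gen

# Cumulative slowing-down measure s(t) = int dt'/tau_dep(t'); a positron
# born at t_i has spent its full kinetic energy once s(t) - s(t_i) = 1.
s = np.cumsum(dt / tau_dep(t))

# For each time t_j, positrons born at t_i are still depositing while
# s[j] - s[i] < 1; each still-active positron deposits at a rate
# e_+ / tau_dep(t_j), so sum the initial energies of the active ones.
cum_gen = np.concatenate(([0.0], np.cumsum(gen * dt)))
i_min = np.searchsorted(s, s - 1.0, side='right')  # earliest active birth index
e_active = cum_gen[1:] - cum_gen[i_min]
q_delayed = e_active / tau_dep(t)

# The ratio rises above unity once tau_dep(t) exceeds tau_co, i.e. the
# bolometric light curve flattens relative to an exponential decline.
for day in (400, 600, 800, 1000, 1200):
    j = np.searchsorted(t, day)
    print(f"t = {t[j]:6.0f} d   tau_dep = {tau_dep(t[j]):6.0f} d   "
          f"q_delayed/q_instant = {q_delayed[j] / q_instant[j]:.2f}")
```

In this toy model the prompt-deposition limit is recovered at early times, where $\tau_{\rm dep} \ll \tau_{\rm Co}$, while the flattening sets in around the assumed crossing time; the actual epoch and amplitude depend on the value of $\rho t^3$, which is why accurate late-time bolometric light curves constrain it.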