Many massive stars appear to undergo enhanced mass-loss during late stages of their evolution. In some cases, the ejected mass likely originates from non-terminal explosive outbursts, rather than continuous winds. Here we study the dependence of the ejecta mass, $m_{\rm ej}$, on the energy budget $E$ of an explosion deep within the star, using both analytical arguments and numerical hydrodynamics simulations. Focusing on polytropic stellar models, we find that for explosion energies smaller than the stellar binding energy, the ejected mass scales as $m_{\rm ej} \propto E^{\varepsilon_{\rm m}}$, where $\varepsilon_{\rm m} = 2.4$–$3.0$ depending on the polytropic index. The loss of energy due to shock breakout emission near the stellar edge leads to the existence of a minimal mass-shedding explosion energy, corresponding to a minimal ejecta mass. For a wide range of progenitors, from Wolf–Rayet stars to red supergiants (RSGs), we find a similar limiting energy of $E_{\rm min} \approx 10^{46}\!-\!10^{47} \rm \, erg$, almost independent of the stellar radius. The corresponding minimal ejecta mass varies considerably across different progenitors, ranging from ${\sim}\, 10^{-8} \, \rm M_\odot$ in compact stars, up to ${\sim}\, 10^{-2} \, \rm M_\odot$ in RSGs. We discuss implications of our results for pre-supernova outbursts driven by wave heating, and complications caused by the non-constant opacity and adiabatic index of realistic stars.
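
As an order-of-magnitude illustration of the quoted scaling (a sketch only: the normalization $m_{\rm ej} \to M_*$ as $E \to E_{\rm bind}$, with $M_*$ the total stellar mass and $E_{\rm bind}$ the stellar binding energy, is an assumption introduced here for concreteness rather than a result stated above), one may write
\begin{equation*}
    m_{\rm ej} \sim M_* \left(\frac{E}{E_{\rm bind}}\right)^{\varepsilon_{\rm m}}, \qquad \varepsilon_{\rm m} \simeq 2.4\text{--}3.0 ,
\end{equation*}
so that, under this assumption, an outburst carrying $E = 10^{-2}\,E_{\rm bind}$ would eject only a fraction ${\sim}\, 10^{-2\varepsilon_{\rm m}} \approx 10^{-5}\!-\!10^{-6}$ of the stellar mass, and for $E < E_{\rm min}$ the breakout losses described above prevent any mass-shedding.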