Context. Explaining the magnetic fields currently observed in galaxies requires relatively strong seeding in the early Universe. One current theory proposes that magnetic seeds of the order of $\mu$G were expelled by supernova (SN) explosions after primordial fields of nG or weaker were amplified in stellar interiors.
Aims. In this work, we take a closer look at this theory and calculate the maximum magnetic energy that can be injected into the interstellar medium by a stellar cluster of mass $M_{\rm cl}$, based on what is currently known about stellar magnetism.
Methods. We consider early-type stars and adopt either a Salpeter or a top-heavy IMF. For their magnetic fields, we adopt either a Gaussian or a bimodal distribution. The Gaussian model assumes that all massive stars are magnetized with $10^3\,{\rm G} < B_* < 10^4\,{\rm G}$, while the bimodal model, consistent with observations of Milky Way stars, assumes that only 5-10 per cent of OB stars have $10^3\,{\rm G} < B_* < 10^4\,{\rm G}$, with the rest having $10\,{\rm G} < B_* < 10^2\,{\rm G}$. We ignore the effect of magnetic diffusion and assume no losses of magnetic energy.
Results. We find that the maximum magnetic energy that can be injected by a stellar population is between $10^{-10}$ and $10^{-7}$ times the total SN energy. The upper end of these estimates is about five orders of magnitude lower than what is usually employed in cosmological simulations, where about $10^{-2}$ of the SN energy is injected as magnetic energy.
Conclusions. Pure advection of the stellar magnetic field by SN explosions is a good candidate for seeding a dynamo, but it is not enough to magnetize galaxies. If SNe are the main mechanism for galactic magnetization, the magnetic field cannot exceed an intensity of $10^{-7}$ G even in the best-case scenario of a population of $10^5\,{\rm M}_\odot$ in a superbubble of 300 pc radius, while more typical values are between $10^{-10}$ and $10^{-9}$ G. Therefore, other scenarios for galactic magnetization at high redshift need to be explored.
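As a rough illustration of the energy budget described in the Results, the following back-of-the-envelope Monte Carlo compares the magnetic energy stored in a population of SN progenitors with their total SN energy. It is a sketch under stated assumptions, not the paper's exact method: the progenitor mass range, the mass-radius relation $R_* \propto M_*^{0.6}$, the log-uniform field draws within each interval, and the uniform-field energy estimate $E_B = B_*^2 R_*^3/6$ are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

RSUN_CM = 6.957e10   # solar radius [cm]
E_SN = 1e51          # canonical SN energy [erg]

def sample_salpeter(n, m_lo=8.0, m_hi=100.0, alpha=2.35):
    """Inverse-transform sampling of dN/dM ~ M^-alpha (masses in Msun)."""
    u = rng.uniform(size=n)
    a = 1.0 - alpha
    return (m_lo**a + u * (m_hi**a - m_lo**a)) ** (1.0 / a)

def fields_gaussian_model(n):
    """'Gaussian' model of the abstract, approximated here as
    log-uniform draws with 1e3 G < B* < 1e4 G for every star."""
    return 10.0 ** rng.uniform(3.0, 4.0, size=n)

def fields_bimodal_model(n, strong_frac=0.07):
    """Bimodal model: ~5-10% of OB stars at 1e3-1e4 G, rest at 10-1e2 G."""
    strong = rng.uniform(size=n) < strong_frac
    return 10.0 ** np.where(strong,
                            rng.uniform(3.0, 4.0, size=n),
                            rng.uniform(1.0, 2.0, size=n))

# Assume ~1 SN progenitor per ~100 Msun of stars formed, so a
# 1e5 Msun cluster hosts ~1e3 massive stars, each ending as a SN.
n_stars = 1000
m = sample_salpeter(n_stars)          # progenitor masses [Msun]
r = RSUN_CM * m**0.6                  # crude main-sequence R(M) [cm]

for label, B in [("Gaussian", fields_gaussian_model(n_stars)),
                 ("bimodal", fields_bimodal_model(n_stars))]:
    # Uniform field over the stellar volume: E_B = (B^2/8pi)*(4/3 pi R^3)
    E_B = np.sum(B**2 * r**3 / 6.0)   # total stellar magnetic energy [erg]
    print(f"{label:8s}: E_B/E_SN ~ {E_B / (n_stars * E_SN):.1e}")
```

With these crude choices the Gaussian model lands near the lower end of the quoted $10^{-10}$-$10^{-7}$ range; the upper end presumably corresponds to more favorable assumptions (e.g., a top-heavy IMF or larger stellar radii), which this sketch does not attempt to reproduce.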
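Similarly, the upper bound quoted in the Conclusions can be recovered, to within factors of a few, by spreading the injected magnetic energy uniformly over the superbubble volume and using the CGS energy density $u_B = B^2/8\pi$. The SN count per unit cluster mass is an assumed round number, not a figure from the abstract.

```python
import numpy as np

PC_CM = 3.086e18       # parsec [cm]
E_SN = 1e51            # canonical SN energy [erg]

n_sn = 1000            # ~1 SN per 100 Msun for a 1e5 Msun population (assumed)
eps = 1e-7             # best-case magnetic-to-SN energy ratio from the Results
R_bubble_cm = 300.0 * PC_CM

E_B = eps * n_sn * E_SN                   # injected magnetic energy [erg]
V = 4.0 / 3.0 * np.pi * R_bubble_cm**3    # superbubble volume [cm^3]
B = np.sqrt(8.0 * np.pi * E_B / V)        # uniform-field strength [G]
print(f"B ~ {B:.1e} G")                   # a few 1e-8 G, consistent with <~1e-7 G
```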