Electron microprobe-based quantitative compositional measurement of first-row transition metals using their L$\alpha$ X-ray lines is hampered by, among other effects, self-absorption. This effect, which occurs when a broad X-ray line lies close to a broad absorption edge, is not accounted for by matrix corrections. To assess the error incurred by neglecting self-absorption, we calculate the L$\alpha$ X-ray intensity emitted from metallic Fe, Ni, Cu, and Zn targets, assuming a Lorentzian profile for the X-ray line and taking into account the energy dependence of the mass absorption coefficient near the absorption edge. We find that the calculated X-ray intensities depart increasingly, with increasing electron beam energy, from those obtained assuming a narrow X-ray line and a single fixed absorption coefficient (the conventional approach), with a maximum deviation of $\sim$15% for Ni and $\sim$10% for Fe. In contrast, X-ray intensities calculated for metallic Zn and Cu do not differ significantly from those obtained using the conventional approach. The implications of these results for the analysis of transition-metal compounds by electron probe microanalysis, as well as strategies to account for self-absorption effects, are discussed.
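To make the comparison concrete, the following minimal Python sketch contrasts the two calculations for a single toy case: a Lorentzian line profile attenuated with an energy-dependent mass absorption coefficient that steps across the edge, versus a delta-function line attenuated with a single fixed coefficient. All numerical values (line energy and width, edge position, absorption coefficients, take-off angle, and the depth distribution) are illustrative assumptions chosen for the example, not the data or depth-distribution models used in this work.

```python
import numpy as np

# --- Illustrative parameters (assumed for this sketch, not the paper's values) ---
E0 = 0.852           # keV, nominal Ni L-alpha line energy
GAMMA = 0.002        # keV, Lorentzian half-width at half-maximum (assumed)
E_EDGE = 0.855       # keV, approximate Ni L3 absorption-edge energy
MU_BELOW = 2.0e3     # cm^2/g, mass absorption coefficient below the edge (assumed)
MU_ABOVE = 6.0e3     # cm^2/g, mass absorption coefficient above the edge (assumed)
RHO_Z = np.linspace(0.0, 3.0e-4, 200)        # g/cm^2, mass-depth grid
PHI = np.exp(-((RHO_Z - 5e-5) / 1e-4) ** 2)  # toy depth distribution phi(rho z)
CSC_PSI = 1.0 / np.sin(np.radians(40.0))     # assumed 40-degree take-off angle

def lorentzian(E):
    """Normalized Lorentzian line profile centered at E0."""
    return (GAMMA / np.pi) / ((E - E0) ** 2 + GAMMA ** 2)

def mac(E):
    """Step model of the mass absorption coefficient across the edge."""
    return np.where(E < E_EDGE, MU_BELOW, MU_ABOVE)

def emitted_intensity_broad_line():
    """Weight the depth-integrated absorption factor over the line profile."""
    E = np.linspace(E0 - 20 * GAMMA, E0 + 20 * GAMMA, 2000)
    weights = lorentzian(E)
    weights /= np.trapz(weights, E)  # renormalize on the finite grid
    # Attenuation exp(-mu(E) * rho_z * csc(psi)) for every (E, rho_z) pair
    atten = np.exp(-np.outer(mac(E), RHO_Z) * CSC_PSI)
    depth_integral = np.trapz(PHI * atten, RHO_Z, axis=1)
    return np.trapz(weights * depth_integral, E)

def emitted_intensity_conventional():
    """Conventional approach: narrow (delta) line, single fixed MAC at E0."""
    atten = np.exp(-mac(np.array([E0]))[0] * RHO_Z * CSC_PSI)
    return np.trapz(PHI * atten, RHO_Z)

ratio = emitted_intensity_broad_line() / emitted_intensity_conventional()
print(f"broad-line / conventional intensity ratio: {ratio:.3f}")
```

With the line center below the edge and part of the Lorentzian tail above it, the ratio departs from unity; increasing the depth of X-ray generation (as a higher beam energy would) magnifies the departure, which is the qualitative behavior the calculation in this work quantifies.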