The part played by stars in the ionization of the intergalactic medium (IGM) remains an open question. A key issue is the proportion of the stellar ionizing radiation that escapes the galaxies in which it is produced. Spectroscopy of gamma-ray burst (GRB) afterglows can be used to determine the neutral hydrogen column density, N_HI, in their host galaxies and hence the opacity to extreme ultraviolet (EUV) radiation along the lines of sight to the bursts. Thus, making the reasonable assumption that long-duration GRB locations are representative of the sites of the massive stars that dominate EUV production, one can calculate an average escape fraction of ionizing radiation in a way that is independent of galaxy size, luminosity or underlying spectrum. Here we present a sample of N_HI measures for 138 GRBs in the range 1.6 < z < 6.7 and use it to establish an average escape fraction at the Lyman limit of f_esc ≈ 0.005, with a 98% confidence upper limit of f_esc ≈ 0.015. This analysis suggests that stars provide a small contribution to the ionizing radiation budget of the IGM at z < 5, where the bulk of the bursts lie. At higher redshifts, z > 5, firm conclusions are limited by the small size of the GRB sample (7/138), but any decline in average H I column density appears to be modest. We also find no indication of a significant correlation of N_HI with galaxy UV luminosity or host stellar mass, for the subset of events for which these are available. We discuss in some detail a number of selection effects and potential biases. Drawing on a range of evidence we argue that such effects, while not negligible, are unlikely to produce systematic errors (in either direction) of more than a factor ∼ 2 in f_esc, and so would not affect the primary conclusions.
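The averaging described above can be illustrated with a minimal sketch: each sight-line transmits a fraction exp(−τ) of Lyman-limit photons, with τ = σ_LL N_HI, and the sample mean of these transmissions estimates f_esc. The cross-section value is the standard hydrogen photoionization cross-section at the Lyman limit; the column densities in the example are made up for illustration and are not from the paper's sample.

```python
import math

# Hydrogen photoionization cross-section at the Lyman limit (cm^2);
# standard literature value, not specific to this paper.
SIGMA_LL = 6.3e-18

def escape_fraction(log_nhi_values):
    """Sight-line-averaged Lyman-limit escape fraction.

    Each sight-line transmits exp(-tau) with tau = SIGMA_LL * N_HI;
    the mean transmission over the sample estimates f_esc.
    log_nhi_values: log10 of H I column densities in cm^-2.
    """
    transmissions = [math.exp(-SIGMA_LL * 10.0 ** log_nhi)
                     for log_nhi in log_nhi_values]
    return sum(transmissions) / len(transmissions)

# Illustrative (invented) log10 N_HI values; real GRB sight-lines are
# typically dominated by high columns, driving f_esc toward zero.
sample = [17.0, 19.5, 21.0, 21.5, 22.0]
print(f"f_esc ~ {escape_fraction(sample):.4f}")
```

Because the transmission falls off exponentially in N_HI, a single low-column sight-line dominates the average, which is why the method requires a large, unbiased sample of bursts.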
Given that many GRB hosts are low-metallicity dwarf galaxies with high specific star-formation rates, these results present a particular problem for the hypothesis that such galaxies dominated the reionization of the universe.