Shannon observed that the normal distribution has maximal entropy among all distributions that possess a density function and have a given variance. This observation sparked a significant body of research in statistics, broadly concerned with goodness-of-fit estimators based on Shannon entropy for a variety of distributions and, in particular, with normality testing. The present paper proposes to use compression algorithms and other parsing-based entropy estimators to match a sample, in sampling order, to one of a set of candidate distributions sharing the observed mean μ and, where applicable, standard deviation σ; each candidate's quantile function converts the samples into a string of symbols for entropy estimation. Through a series of Monte Carlo simulations, the paper demonstrates that the proposed technique may be able to distinguish between several common distributions even when the samples themselves are not i.i.d.
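To make the procedure concrete, the following sketch (not taken from the paper; the function names, the 16-bin quantization, and the candidate set are all illustrative assumptions) bins each observation by which inter-quantile interval of the candidate distribution it falls into, computed here via the CDF, which is the inverse of the quantile function, and uses the zlib-compressed size of the resulting symbol string as a crude entropy estimate:

```python
# Illustrative sketch only: names, bin count, and candidate set are
# assumptions, not the paper's actual implementation.
import zlib

import numpy as np
from scipy import stats

BINS = 16  # equiprobable quantile bins -> at most 4 bits of entropy per symbol

def compression_score(sample, dist):
    """Estimate the entropy of `sample` under candidate `dist` by
    compression: apply the probability integral transform (the CDF,
    i.e. the inverse of the quantile function), quantize into BINS
    equiprobable bins, and compress the symbol string with zlib."""
    u = dist.cdf(sample)                        # ~Uniform(0,1) iff dist fits
    symbols = np.minimum((u * BINS).astype(np.uint8), BINS - 1)
    payload = symbols.tobytes()                 # keep sampling order intact
    return len(zlib.compress(payload, 9)) / len(payload)

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=10_000)

# Candidates share the sample's observed mu and sigma (moment matching).
mu, sigma = x.mean(), x.std(ddof=1)
candidates = {
    "normal":  stats.norm(mu, sigma),
    "laplace": stats.laplace(mu, sigma / np.sqrt(2)),
    "uniform": stats.uniform(mu - sigma * np.sqrt(3), 2 * sigma * np.sqrt(3)),
}
for name, dist in candidates.items():
    print(f"{name:8s} {compression_score(x, dist):.3f}")
```

Under this scoring, the candidate that actually generated the data should yield near-uniform, nearly incompressible symbols (a ratio close to the maximum), while mismatched candidates produce skewed symbol frequencies that compress better; because the symbols are kept in sampling order, serial dependence among the samples likewise lowers the score, which is what allows the method to react to non-i.i.d. data.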