The analytic information theory of discrete distributions was initiated in
1998 by C. Knessl, P. Jacquet and S. Szpankowski, who addressed the precise
evaluation of the Rényi and Shannon entropies of the Poisson, Pascal (or
negative binomial) and binomial distributions. They derived various
asymptotic approximations and, in some cases, lower and upper bounds for these
quantities. Here we extend these investigations in two ways. First, we
consider a much larger class of distributions, the Rakhmanov distributions
$\rho_n(x)=\omega(x)\,y_n^2(x)$, where $\{y_n(x)\}$ denotes a sequence of
discrete hypergeometric-type polynomials orthogonal with respect to a weight
function $\omega(x)$ of Poisson, Pascal, binomial or hypergeometric type;
that is, the polynomials of Charlier, Meixner, Kravchuk and Hahn. Second,
we obtain explicit expressions for the relative Fisher information of these
four families of Rakhmanov distributions with respect to their respective
weight functions.
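
To make the construction concrete, the following minimal Python sketch builds the Rakhmanov distribution for the Charlier (Poisson-weight) case and evaluates its Shannon and Rényi entropies by direct summation on a truncated support. It relies only on the standard three-term recurrence and orthogonality norm of the Charlier polynomials; the function names (charlier, rakhmanov_charlier, shannon_entropy, renyi_entropy) and the truncation parameter x_max are illustrative choices of ours, not notation from the paper.

import numpy as np
from math import exp, factorial

def charlier(n, x, a):
    # Charlier polynomial C_n(x; a) via the standard three-term recurrence
    # a*C_{k+1} = (k + a - x)*C_k - k*C_{k-1},  with C_0 = 1, C_1 = 1 - x/a.
    if n == 0:
        return 1.0
    c_prev, c = 1.0, 1.0 - x / a
    for k in range(1, n):
        c_prev, c = c, ((k + a - x) * c - k * c_prev) / a
    return c

def rakhmanov_charlier(n, a, x_max=200):
    # Rakhmanov distribution rho_n(x) = omega(x) * yhat_n(x)^2, where
    # omega(x) = e^{-a} a^x / x! is the Poisson weight and
    # yhat_n = C_n * sqrt(a^n / n!) is the orthonormal Charlier polynomial
    # (normalization from sum_x omega(x) C_n(x)^2 = n! / a^n).
    xs = np.arange(x_max + 1)
    omega = np.empty(x_max + 1)
    omega[0] = exp(-a)
    for x in range(1, x_max + 1):   # build Poisson weights iteratively (stable)
        omega[x] = omega[x - 1] * a / x
    poly = np.array([charlier(n, x, a) for x in xs])
    rho = omega * poly**2 * (a**n / factorial(n))
    return xs, rho

def shannon_entropy(p):
    # S[p] = -sum_x p(x) log p(x) over the support of p.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, q):
    # R_q[p] = (1/(1-q)) log sum_x p(x)^q, for q != 1.
    p = p[p > 0]
    return np.log(np.sum(p**q)) / (1.0 - q)

xs, rho = rakhmanov_charlier(n=3, a=2.0)
print("normalization:", rho.sum())             # ~ 1, checks the orthonormality factor
print("Shannon entropy:", shannon_entropy(rho))
print("Renyi entropy (q=2):", renyi_entropy(rho, 2.0))

The Meixner, Kravchuk and Hahn cases follow the same pattern, with the Pascal, binomial and hypergeometric weights and the corresponding three-term recurrences in place of the Poisson/Charlier pair.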