“…In fact, the authors applied it to finite-length evaluations of the tail probability, the error probability in simple hypothesis testing, source coding, channel coding, and random number generation in Markov chains, as well as the estimation error of a parametric family of transition matrices [17,18]. Thus, we revisit the exponential family of transition matrices [2,5] in a manner consistent with the above purpose by using the Bregman divergence [21,20]. In particular, the relative Rényi entropy for transition matrices plays an important role in the finite-length analysis; we define the relative entropy for transition matrices so that it is a special case of the relative Rényi entropy, which differs from the definitions in the literature [2,5].…”
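For orientation, the display below recalls the standard Bregman divergence together with one common convention for the relative Rényi entropy rate of transition matrices, defined through the Perron–Frobenius eigenvalue of a tilted matrix. The excerpt stresses that the paper adopts its own definition, different from those in [2,5], so the following should be read only as a familiar reference point, not as the definition used there.
\[
  D_{\phi}(x \,\|\, y) \;=\; \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle ,
\]
for a differentiable strictly convex potential $\phi$; and, for irreducible transition matrices $W$ and $V$ on a finite state space,
\[
  D_{\alpha}(W \,\|\, V) \;=\; \frac{1}{\alpha - 1}\,
  \log \lambda_{\alpha}, \qquad
  \lambda_{\alpha} = \text{Perron--Frobenius eigenvalue of } \bigl[\, W(x'|x)^{\alpha}\, V(x'|x)^{1-\alpha} \,\bigr]_{x,x'} ,
\]
whose $\alpha \to 1$ limit recovers the Kullback–Leibler divergence rate
\[
  D(W \,\|\, V) \;=\; \sum_{x} \pi_{W}(x) \sum_{x'} W(x'|x)\,
  \log \frac{W(x'|x)}{V(x'|x)},
\]
where $\pi_{W}$ denotes the stationary distribution of $W$.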