Abstract-In this work, we provide a computable expression for the Kullback-Leibler divergence rate $\lim_{n\to\infty} \frac{1}{n} D(p^{(n)} \| q^{(n)})$ between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions $p^{(n)}$ and $q^{(n)}$, respectively. We illustrate it numerically and examine its rate of convergence. The main tools used to obtain the Kullback-Leibler divergence rate and its rate of convergence are the theory of nonnegative matrices and Perron-Frobenius theory. Similarly, we provide a formula for the Shannon entropy rate $\lim_{n\to\infty} \frac{1}{n} H(p^{(n)})$ of Markov sources and examine its rate of convergence.
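For the special case of two stationary, irreducible first-order chains with transition matrices $P$ and $Q$ satisfying $Q_{ij} > 0$ wherever $P_{ij} > 0$, the divergence rate reduces to the classical closed form $\sum_i \pi_i \sum_j P_{ij} \log (P_{ij}/Q_{ij})$, where $\pi$ is the stationary distribution of $P$. The following minimal Python sketch computes this special case only, not the paper's general arbitrary-order, arbitrary-initial-distribution expression; the example matrices are hypothetical.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of an irreducible row-stochastic matrix P."""
    evals, evecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(evals - 1.0))   # eigenvalue 1 of a stochastic matrix
    pi = np.real(evecs[:, k])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    """KL divergence rate (nats/symbol) between stationary first-order chains.

    Assumes P is irreducible and Q[i, j] > 0 wherever P[i, j] > 0, so that
    every summand below is finite.
    """
    pi = stationary_distribution(P)
    rate = 0.0
    for i in range(P.shape[0]):
        for j in range(P.shape[1]):
            if P[i, j] > 0.0:
                rate += pi[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
    return rate

# Hypothetical two-state example.
P = np.array([[0.9, 0.1], [0.2, 0.8]])
Q = np.array([[0.7, 0.3], [0.4, 0.6]])
print(kl_divergence_rate(P, Q))
```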
In this work, we examine the existence and the computation of the Rényi divergence rate, $\lim_{n\to\infty} \frac{1}{n} D_\alpha(p^{(n)} \| q^{(n)})$, between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions $p^{(n)}$ and $q^{(n)}$, respectively. This yields a generalization of a result of Nemetz where he assumed that the initial probabilities under $p^{(n)}$ and $q^{(n)}$ are strictly positive. The main tools used to obtain the Rényi divergence rate are the theory of nonnegative matrices and Perron-Frobenius theory. We also provide numerical examples and investigate the limits of the Rényi divergence rate as $\alpha \to 1$ and as $\alpha \to 0$. Similarly, we provide a formula for the Rényi entropy rate $\lim_{n\to\infty} \frac{1}{n} H_\alpha(p^{(n)})$ of Markov sources and examine its limits as $\alpha \to 1$ and as $\alpha \to 0$. Finally, we briefly provide an application to source coding.
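For first-order chains with strictly positive transition matrices $P$ and $Q$ (Nemetz's setting), the order-$\alpha$ Rényi divergence rate admits the closed form $\frac{1}{\alpha-1}\log \lambda(R_\alpha)$, where $R_\alpha$ has entries $P_{ij}^{\alpha} Q_{ij}^{1-\alpha}$ and $\lambda(\cdot)$ denotes the Perron-Frobenius eigenvalue. A minimal Python sketch under those positivity assumptions, with the same hypothetical matrices as above:

```python
import numpy as np

def renyi_divergence_rate(P, Q, alpha):
    """Order-alpha Rényi divergence rate (nats/symbol), Nemetz-style closed form.

    Assumes strictly positive transition matrices P and Q, alpha > 0, alpha != 1.
    """
    R = P**alpha * Q**(1.0 - alpha)                # entries P_ij^a * Q_ij^(1-a)
    lam = np.max(np.real(np.linalg.eigvals(R)))    # Perron-Frobenius eigenvalue
    return np.log(lam) / (alpha - 1.0)

P = np.array([[0.9, 0.1], [0.2, 0.8]])
Q = np.array([[0.7, 0.3], [0.4, 0.6]])
print(renyi_divergence_rate(P, Q, 0.999))  # should be close to the KL rate above
```

As a sanity check, the returned value should approach the Kullback-Leibler divergence rate of the previous sketch as $\alpha \to 1$.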
Abstract-In an earlier work, Poor and Verdú established an upper bound for the reliability function of arbitrary single-user discrete-time channels with memory. They also conjectured that their bound is tight for all coding rates. In this note, we demonstrate via a counterexample involving memoryless binary erasure channels (BECs) that the Poor-Verdú upper bound is not tight at low rates. We conclude by examining possible improvements to this bound.
Index Terms-Arbitrary channels with memory, binary erasure channels (BECs), channel coding, channel reliability function, information spectrum, probability of error.
In [6], Csiszár established the concept of forward $\beta$-cutoff rate for the error exponent hypothesis testing problem based on independent and identically distributed (i.i.d.) observations. Given $\beta < 0$, he defined the forward $\beta$-cutoff rate as the number $R_0 \geq 0$ that provides the best possible lower bound in the form $\beta(E - R_0)$ to the type 1 error exponent function for hypothesis testing, where $E > 0$ is the rate of exponential convergence to 0 of the type 2 error probability. He then demonstrated that the forward $\beta$-cutoff rate is given by $D_{1/(1-\beta)}(\bar{X} \| X)$, where $D_\alpha(\bar{X} \| X)$ denotes the Rényi $\alpha$-divergence [19], $\alpha > 0$, $\alpha \neq 1$. Similarly, for $0 < \beta < 1$, Csiszár also established the concept of reverse $\beta$-cutoff rate for the correct exponent hypothesis testing problem. In this work, we extend Csiszár's results by investigating the forward and reverse $\beta$-cutoff rates for the hypothesis testing between two arbitrary sources with memory. We demonstrate that the lim inf $\alpha$-divergence rate provides the expression for the forward $\beta$-cutoff rate. We also show that if the log-likelihood large deviation spectrum admits a limit, then the reverse $\beta$-cutoff rate equals the lim inf $\alpha$-divergence rate, where $\alpha = 1/(1-\beta)$ and $0 < \beta < \beta_{\max}$, where $\beta_{\max}$ is the largest $\beta < 1$ for which the lim inf $\alpha$-divergence rate is finite. For $\beta_{\max} \leq \beta < 1$, we show that the reverse $\beta$-cutoff rate is in general only upper-bounded by the lim inf Rényi divergence rate. Unlike in [4], where the alphabet for the source coding cutoff rate problem was assumed to be finite, we assume an arbitrary (countable or continuous) source alphabet. We also provide several examples to illustrate our forward and reverse $\beta$-cutoff rate results and the techniques employed to establish them.
Index Terms-$\alpha$-divergence rate, arbitrary sources with memory, forward and reverse cutoff rates, hypothesis testing error and correct exponents, information spectrum, large deviation theory.
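As a concrete reference point for the quantities involved, the sketch below implements the single-letter Rényi $\alpha$-divergence for finite i.i.d. hypotheses together with the order substitution $\alpha = 1/(1-\beta)$ (so $\beta < 0$ gives $\alpha \in (0,1)$, while $0 < \beta < 1$ gives $\alpha > 1$). It does not cover the paper's general sources-with-memory setting, and the argument order, distributions, and function names are illustrative assumptions.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi alpha-divergence D_alpha(p || q) in nats for finite distributions.

    D_alpha(p || q) = (1/(alpha - 1)) * log( sum_x p(x)^alpha * q(x)^(1 - alpha) ).
    Assumes alpha > 0, alpha != 1, and q(x) > 0 wherever p(x) > 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = p > 0
    return np.log(np.sum(p[m]**alpha * q[m]**(1.0 - alpha))) / (alpha - 1.0)

def cutoff_order(beta):
    """Order substitution alpha = 1/(1 - beta) used by the cutoff-rate results."""
    return 1.0 / (1.0 - beta)

# Hypothetical i.i.d. hypotheses; the argument order is an illustrative choice.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
beta = -1.0                                 # beta < 0  ->  alpha = 0.5 in (0, 1)
print(renyi_divergence(p, q, cutoff_order(beta)))
```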