2001
DOI: 10.1109/18.923736
Renyi's divergence and entropy rates for finite alphabet Markov sources

Abstract: In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^(n) ‖ q^(n)), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions, described by the probability distributions p and q, respectively. This yields a generalization of a result of Nemetz, who assumed that the initial probabilities under p and q are strictly positive. The main tools used to obtain the Rényi divergence rate are the theory of nonnegative matrices and Perron-Frobenius theory.
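The Perron-Frobenius approach the abstract mentions reduces, for irreducible chains with strictly positive transition probabilities and α > 0, α ≠ 1, to the known closed form log(λ)/(α − 1), where λ is the Perron (largest) eigenvalue of the nonnegative matrix with entries p_ij^α · q_ij^(1−α). A minimal sketch in Python; the function and variable names are illustrative, not from the paper, and this is a hedged illustration of the closed form rather than the paper's own implementation:

```python
import numpy as np

def renyi_divergence_rate(P, Q, alpha):
    """Sketch of the Renyi divergence rate between two first-order
    Markov sources with row-stochastic transition matrices P and Q.

    Assumes strictly positive transition probabilities and
    alpha > 0, alpha != 1, so the rate equals log(lam) / (alpha - 1),
    where lam is the Perron eigenvalue of the nonnegative matrix
    with entries P[i, j]**alpha * Q[i, j]**(1 - alpha).
    """
    R = P**alpha * Q**(1.0 - alpha)           # elementwise product
    lam = max(abs(np.linalg.eigvals(R)))      # spectral radius = Perron root
    return float(np.log(lam) / (alpha - 1.0))

# Sanity check: for identical chains R reduces to P itself, whose Perron
# root is 1 (row-stochastic matrix), so the divergence rate is 0.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.5, 0.5],
              [0.4, 0.6]])
print(renyi_divergence_rate(P, P, 0.5))   # ~0.0
print(renyi_divergence_rate(P, Q, 0.5))   # positive for distinct chains
```

For α = 1/2 the entries of R are √(p_ij q_ij), so by Cauchy-Schwarz every row sum is at most 1, with strict inequality when the rows differ; the Perron root then drops below 1 and the rate is strictly positive.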

Cited by 58 publications (54 citation statements). References 15 publications.
“…The Rényi entropy rate was first defined by Rached et al [25] for an ergodic Markov chain with a finite state space. The entropy rate of this process is expressed as…”
Section: Introduction (mentioning)
confidence: 99%
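The formula this quote truncates is presumably the closed form from Rached et al.: for an ergodic finite-state chain with row-stochastic transition matrix P, the Rényi entropy rate of order α ≠ 1 is log(λ)/(1 − α), with λ the Perron eigenvalue of the matrix with entries p_ij^α. A hedged sketch under those assumptions (names are illustrative):

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Sketch of the Renyi entropy rate (order alpha != 1) of an ergodic
    finite-state Markov chain with row-stochastic transition matrix P,
    assuming the closed form log(lam) / (1 - alpha), where lam is the
    Perron eigenvalue of the matrix with entries P[i, j]**alpha.
    """
    lam = max(abs(np.linalg.eigvals(P**alpha)))   # Perron root
    return float(np.log(lam) / (1.0 - alpha))

# Sanity check: a chain with uniform rows is an i.i.d. uniform binary
# source, whose Renyi entropy rate is log 2 for every order alpha.
U = np.full((2, 2), 0.5)
print(renyi_entropy_rate(U, 0.5))   # ~0.6931 (= log 2)
print(renyi_entropy_rate(U, 2.0))   # ~0.6931 (= log 2)
```

As α → 1 this expression recovers the Shannon entropy rate, which is why the citing papers treat it as the natural generalization.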
“…When the transition matrix has several irreducible components, we need to consider the mixture distribution among the possible irreducible components, which is defined by the initial distribution. As discussed in [54,Theorem 1], in the finite state space, the asymptotic behavior of the (conditional) Rényi entropy is characterized by the maximum (conditional) Rényi entropy among the possible irreducible components, which depend on the initial distribution. Hence, for large deviation and moderate deviation, the exponential decreasing rate of the leaked information can be evaluated by the minimum rate among the possible irreducible components.…”
Section: Discussion (mentioning)
confidence: 99%
“…For example, the rate of Shannon entropy for a stationary Gaussian process was obtained by Kolmogorov (1958). The rate of Renyi entropy for stochastic processes was obtained by Rached et al (1999).…”
Section: Introduction (mentioning)
confidence: 99%
“…Up to now, the Renyi entropy rate for some stochastic processes has been studied, for example, by Rached et al. (1999, 2004) and Golshani and Pasha (2010). But in the case of the Tsallis entropy, the rate of entropy for stochastic processes has not been obtained for any processes yet.…”
Section: Introduction (mentioning)
confidence: 99%