2013
DOI: 10.1002/asjc.806
An Elementary Derivation of the Large Deviation Rate Function for Finite State Markov Chains

Abstract: Large deviation theory is a branch of probability theory that is devoted to a study of the "rate" at which empirical estimates of various quantities converge to their true values. The object of study in this paper is the rate at which estimates of the doublet frequencies of a Markov chain over a finite alphabet converge to their true values. In case the Markov process is actually an i.i.d. process, the rate function turns out to be the relative entropy (or Kullback-Leibler divergence) between the true and the …
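For the i.i.d. special case mentioned in the abstract, the rate function is the relative entropy D(q‖p) between the empirical and the true distribution (Sanov's theorem). A minimal sketch of that standard formula — my own illustration, not code from the paper:

```python
import numpy as np

def kl_divergence(q, p):
    """Relative entropy D(q || p) between two discrete distributions.
    Convention: terms with q_i = 0 contribute 0."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# True distribution p and a hypothetical empirical estimate q
# over a 3-letter alphabet (values chosen only for illustration).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
rate = kl_divergence(q, p)  # large-deviation rate of observing type q
```

The probability of observing an empirical distribution near q then decays roughly like exp(-n · D(q‖p)) in the sample size n.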

Cited by 28 publications (5 citation statements)
References 29 publications
“…is called the edge measure pertaining to P. Observe that the map from an irreducible transition matrix P to its edge measure is one-to-one (see, e.g., [8]) and that the set of all edge measures Q(X, E) can be expressed as…”
Section: Irreducible Markov Chains
confidence: 99%
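The edge measure quoted above is, for an irreducible P with stationary distribution π, the joint distribution Q(i, j) = π_i · P_ij over edges; the map P ↦ Q is one-to-one because P can be recovered via P_ij = Q_ij / Σ_k Q_ik. A small sketch (my own illustration, with an arbitrary 2-state matrix):

```python
import numpy as np

def edge_measure(P):
    """Edge measure Q(i, j) = pi_i * P_ij of an irreducible transition
    matrix P, where pi is its unique stationary distribution."""
    P = np.asarray(P, float)
    # pi solves pi @ P = pi: take the eigenvector of P.T for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi = pi / pi.sum()          # normalize to a probability vector
    return pi[:, None] * P      # Q[i, j] = pi[i] * P[i, j]

P = [[0.9, 0.1],
     [0.4, 0.6]]
Q = edge_measure(P)
# Recover P from Q, showing the map is invertible:
P_back = Q / Q.sum(axis=1, keepdims=True)
```

By stationarity the row sums and column sums of Q coincide (both equal π), which is exactly the consistency constraint defining the set of edge measures.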
“…Let the set of all possible types of length-L, stationary, third-order Markov sequences be P_L. The cardinality of P_L is upper-bounded by (L + 1)^4 as shown in [22]. The type class, T_L, of a given type, p ∈ P_L, is then defined as the set of all length-L sequences whose types are equal to p:…”
Section: Proof of Theorem
confidence: 99%
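The "type" of a sequence is its vector of empirical frequencies, and all sequences sharing a type form its type class. A minimal sketch for first-order doublet types — the quote concerns third-order statistics, but the construction is the same:

```python
from collections import Counter

def doublet_type(seq, alphabet):
    """Empirical doublet (pair) frequencies of a sequence: its 'type'
    in the method-of-types sense, here at first order."""
    n = len(seq) - 1  # number of overlapping pairs
    counts = Counter(zip(seq, seq[1:]))
    return {(a, b): counts[(a, b)] / n for a in alphabet for b in alphabet}

t = doublet_type("ababba", "ab")  # toy sequence over a 2-letter alphabet
```

Since each pair count is an integer between 0 and the sequence length, the number of distinct types grows only polynomially in L, which is the counting fact the quoted bound exploits.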
“…We now use some Large Deviations theory to make an argument about the probability of error in the hypothesis test. 1) Large Deviations Principle: Vidyasagar [22] provides an extensive analysis of large deviations theory for Markov processes. Theorems 3 and 4, shown below, utilize this analysis along with [23, Lemma 1], which allows us to make an argument about the probability of error for the subsequent hypothesis test.…”
Section: Proof of Theorem
confidence: 99%
“…Finite‐state fading channel (FSFC) models are widely adopted for modeling wireless flat‐fading channels. The Gilbert‐Elliott channel (GEC) model can be considered as the rudiment of FSFC models, where a two‐state (good and bad) Markov network state determines the channel error probability.…”
Section: Introduction
confidence: 99%