2014
DOI: 10.3390/e16031396

Infinite Excess Entropy Processes with Countable-State Generators

Abstract: We present two examples of finite-alphabet, infinite excess entropy processes generated by stationary hidden Markov models (HMMs) with countable state sets. The first, simpler example is not ergodic, but the second is. These are the first explicit constructions of processes of this type.
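
For context, the excess entropy referred to in the abstract is standardly defined as the mutual information between a process's semi-infinite past and future; the following is the textbook definition rather than an equation quoted from this paper:

$$
\mathbf{E} \;=\; I\!\left[X_{-\infty:0};\, X_{0:\infty}\right] \;=\; \lim_{\ell \to \infty} I\!\left[X_{-\ell:0};\, X_{0:\ell}\right],
$$

where $I[\cdot\,;\cdot]$ denotes mutual information and $X_{a:b}$ is the block of observations from time $a$ up to, but not including, time $b$. An "infinite excess entropy" process is one for which this limit diverges.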

Cited by 30 publications (41 citation statements)
References 16 publications

“…This implies rather small spectral gaps if the process has a countable infinity of causal states (e.g., as in Ref. [41]) or a distribution of eigenvalues heavily weighted near λ = 0, if the process has an uncountable infinity of causal states. In short, complex processes [4,14] are those for which sequence-based algorithms are most likely to fail.…”
Section: Curse of Dimensionality in Predictive Rate-Distortion
confidence: 99%
“…An interesting contrast is the log excess entropy divergence seen at the onset of chaos through period-doubling, associated with a pushdown-stack mechanism [21, Sec. 4.5.1], and seen in the branching copy process [24]. At this stage, though, the possibility of unique associations between the form of information-measure divergence and mechanism is not sufficiently well explored.…”
Section: Discussion
confidence: 99%
“…We calculate E(ℓ) for renewal processes with infinite µ via an analysis technique inspired by Ref. [24] and then calculate E as the limit of E(ℓ) as ℓ tends to infinity, seemingly valid for ergodic processes. First, we would like to directly calculate E(ℓ) in terms of forward- and reverse-time causal states [47]:…”
Section: Appendix B: Renewal Processes With Infinite Mean Interevent Interval
confidence: 99%
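
As a sketch of the quantities named in this excerpt (the notation below follows standard computational-mechanics conventions and is an assumption on our part, not quoted from the citing paper), the finite-length excess entropy and its limit are

$$
E(\ell) \;=\; I\!\left[X_{-\ell:0};\, X_{0:\ell}\right], \qquad \mathbf{E} \;=\; \lim_{\ell \to \infty} E(\ell),
$$

and the causal-state route alluded to above expresses $\mathbf{E}$ through the forward- and reverse-time causal states $\mathcal{S}^{+}$ and $\mathcal{S}^{-}$ as $\mathbf{E} = I\!\left[\mathcal{S}^{+};\, \mathcal{S}^{-}\right]$.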
“…In these examples, the component-process excess entropy $I\!\left[X_{-M:0};\, X_{0:N} \mid \Theta\right] = \left\langle I\!\left[X_{-M:0};\, X_{0:N} \mid \Theta = \theta\right] \right\rangle_{\theta}$ does not diverge with M or N, since finite HMMs have finite excess entropy, which is bounded by the internal state entropy [4,32]. In fact, the excess entropy for many ergodic stochastic processes is finite, even if generated by infinite-state HMMs.…”
Section: Information Analysis
confidence: 99%
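
The bound invoked in this excerpt can be sketched as follows; the argument assumes only that the generating HMM's internal state at time zero, written $S_0$ here, renders past and future conditionally independent, so that $X_{-M:0} \to S_0 \to X_{0:N}$ is a Markov chain:

$$
I\!\left[X_{-M:0};\, X_{0:N}\right] \;\le\; I\!\left[S_0;\, X_{0:N}\right] \;\le\; H[S_0],
$$

by the data-processing inequality. A finite (or finite-entropy) state set therefore forces the excess entropy to remain finite.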
“…One of the few tractable and general constructions is the class of Bandit processes constructed from repeated trials of an experiment whose properties are, themselves, varying stochastically from trial to trial [11,12]. Even if each individual trial is a realization generated by a stationary process with finite memory and exponentially decaying correlations, the resulting process over many trials can be infinitary [3][4][5]. Why can the past-future mutual information of Bandit processes diverge?…”
Section: Introduction
confidence: 99%
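
One way to see how the divergence can arise, sketched here using only the chain rule for mutual information rather than any argument specific to the cited works: writing $\Theta$ for the trial-to-trial parameters,

$$
I\!\left[X_{-M:0};\, X_{0:N}\right] \;=\; I\!\left[\Theta;\, X_{0:N}\right] \;+\; I\!\left[X_{-M:0};\, X_{0:N} \mid \Theta\right] \;-\; I\!\left[\Theta;\, X_{0:N} \mid X_{-M:0}\right].
$$

Even when the conditional (component) term stays bounded, as in the previous excerpt, the unconditioned mutual information can grow without bound whenever the future reveals arbitrarily much information about $\Theta$ while a long past has already pinned $\Theta$ down, keeping the subtracted term small.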