Abstract. We describe an information-theoretic approach to the analysis of sequential data, which emphasises the predictive aspects of perception, and the dynamic process of forming and modifying expectations about an unfolding stream of data, characterising these using a set of process information measures. After reviewing the theoretical foundations and the definition of the predictive information rate, we describe how this can be computed for Gaussian processes, including how the approach can be adapted to …
“…Subsequent to our initial work on binding information [17], we learned that Proposition 1 follows from results presented by Han [21], in which Han analyses the space of information measures that can be expressed as linear combinations of entropies. He discovered a duality relation on this space, and identified the dual total correlation as the formal dual of Watanabe's [22] total correlation.…”
Section: Proposition (citation type: mentioning; confidence: 99%)
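For reference, the two dual quantities named in this excerpt have standard definitions. For a collection of discrete random variables $X_A = (X_\alpha)_{\alpha \in A}$, Watanabe's total correlation and the dual total correlation (the binding information of [17]) are:

```latex
% Total correlation (Watanabe):
C(X_A) = \sum_{\alpha \in A} H(X_\alpha) - H(X_A)

% Dual total correlation / binding information:
B(X_A) = H(X_A) - \sum_{\alpha \in A} H\!\left(X_\alpha \,\middle|\, X_{A \setminus \{\alpha\}}\right)
```

Both are non-negative and vanish exactly when the variables are mutually independent; Han's duality relates them within the space of linear combinations of joint entropies.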
“…We will not prove Proposition 1 here but a sketch of a potential proof can be found in [17]. It is based on a result that can be obtained for N = 3; in this case, it is relatively easy to find that…”
Section: Proposition (citation type: mentioning; confidence: 99%)
“…The first inequality comes directly from the definition of the binding information (17) or (16), since H(X α |X A\{α} ) ≥ 0 for any discrete random variable X α . To obtain the second inequality, we expand the conditional entropies of (16):…”
Section: Bounds on Binding and Multi-information (citation type: mentioning; confidence: 99%)
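The bound described here, $B(X_A) \le H(X_A)$ since each $H(X_\alpha \mid X_{A\setminus\{\alpha\}}) \ge 0$, can be checked numerically for small discrete distributions. A minimal sketch (the function names are illustrative, not from the paper), with the joint pmf given as an N-dimensional NumPy array:

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a probability array (zero cells ignored)."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def binding_information(joint):
    """Binding information (dual total correlation):
    B = H(X_A) - sum_a H(X_a | X_{A\\{a}}),
    where joint is an N-dimensional pmf array, one axis per variable."""
    H_joint = entropy(joint)
    B = H_joint
    for axis in range(joint.ndim):
        # Marginalise out variable `axis` to get the pmf of the rest;
        # then H(X_a | rest) = H(X_A) - H(X_{A\{a}}).
        H_rest = entropy(joint.sum(axis=axis))
        B -= H_joint - H_rest
    return B

# Example: three binary variables with X3 = X1 XOR X2, X1 and X2 uniform.
# Each variable is determined by the other two, so every conditional
# entropy is zero and B equals the joint entropy H = 2 bits.
joint = np.zeros((2, 2, 2))
for x1, x2 in product([0, 1], repeat=2):
    joint[x1, x2, x1 ^ x2] = 0.25
```

Since each conditional entropy term is non-negative, `binding_information(joint)` can never exceed `entropy(joint)`, which is exactly the first inequality quoted above.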
“…, the conditional entropy of one variable given all the others in the sequence, future as well as past, is what we called the residual entropy rate r µ in [17], but was previously identified by Verdú and Weissman [18] as the erasure entropy rate. It can be defined as the limit…”
Section: Entropic Measures of Statistical Structure (citation type: mentioning; confidence: 99%)
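The limit referred to in this excerpt is, for a stationary process $\{X_t\}$, the entropy of one sample conditioned on an ever-widening window of both past and future (this is the standard form of Verdú and Weissman's erasure entropy rate; the notation here is illustrative):

```latex
r_\mu = \lim_{N \to \infty} H\!\left(X_0 \,\middle|\, X_{-N},\dots,X_{-1},\, X_1,\dots,X_N\right)
```

It differs from the ordinary entropy rate, which conditions on the past alone, and is never larger than it, since conditioning on the future as well cannot increase the conditional entropy.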
“…The proof sketch in [17] rests on finding a similar decomposition into non-negative terms when N > 3.…”