2015
DOI: 10.3390/e17010438

A Recipe for the Estimation of Information Flow in a Dynamical System

Abstract: Information-theoretic quantities, such as entropy and mutual information (MI), can be used to quantify the amount of information needed to describe a dataset or the information shared between two datasets. In the case of a dynamical system, the behavior of the relevant variables can be tightly coupled, such that information about one variable at a given instance in time may provide information about other variables at later instances in time. This is often viewed as a flow of information, and tracking such a flow…

Cited by 54 publications (63 citation statements)
References: 51 publications
“…p(x, y) = p(x)p(y). As (3) is not bounded from above, normalized versions have been proposed in the literature [8,9]. For continuous variables, a normalized MI is given by…”
Section: Estimation of Information-Theoretic Quantities from Data (mentioning)
Confidence: 99%
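The quoted passage breaks off before the normalization itself, so the snippet below is only a minimal sketch: it uses a plug-in histogram estimator and normalizes MI by the geometric mean of the marginal entropies, which is one of several conventions in the literature and not necessarily the form used in the cited work.

```python
import numpy as np

def entropy_from_counts(counts):
    """Shannon entropy (in bits) of a histogram given raw bin counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def normalized_mi(x, y, bins=16):
    """Histogram (plug-in) MI estimate, normalized by the geometric mean
    of the marginal entropies (an assumed convention for this sketch)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    hx = entropy_from_counts(joint.sum(axis=1))   # H(X)
    hy = entropy_from_counts(joint.sum(axis=0))   # H(Y)
    hxy = entropy_from_counts(joint.ravel())      # H(X, Y)
    mi = hx + hy - hxy
    return mi / np.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0

# Example: correlated Gaussian samples give a value between 0 and 1.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)
print(normalized_mi(x, y))
```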
“…as we can also find the optimal number of bins by a similar algebra and estimate it as follows [8] (Equation (10)), where the log posterior pdf of the number of bins is given as follows:…”
Section: Generalized Bayesian Piece-Wise Constant Model for Entropy Estimation (mentioning)
Confidence: 99%
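The equation referenced as (10) is not reproduced in the quote. Assuming that reference [8] points to Knuth's Bayesian (optimal data-based) binning, the relative log posterior of M equal-width bins with counts n_k over N samples is N ln M + ln Γ(M/2) − M ln Γ(1/2) − ln Γ(N + M/2) + Σ_k ln Γ(n_k + 1/2), up to an additive constant. A minimal sketch under that assumption:

```python
import numpy as np
from scipy.special import gammaln

def log_posterior_bins(data, m):
    """Relative log posterior of m equal-width bins
    (Knuth's Bayesian binning rule, up to an additive constant)."""
    n = len(data)
    counts, _ = np.histogram(data, bins=m)
    return (n * np.log(m)
            + gammaln(m / 2.0)
            - m * gammaln(0.5)
            - gammaln(n + m / 2.0)
            + np.sum(gammaln(counts + 0.5)))

def optimal_bins(data, m_max=200):
    """Pick the bin count that maximizes the log posterior."""
    m_values = np.arange(1, m_max + 1)
    log_post = [log_posterior_bins(data, m) for m in m_values]
    return m_values[int(np.argmax(log_post))]

rng = np.random.default_rng(0)
data = rng.normal(size=2000)
print(optimal_bins(data))
```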
“…We use a set of parametric and non-parametric bin-counting methods, such as those introduced by Sturges [69], Dixon and Kronmal [70], Scott [71], Freedman and Diaconis [72], Knuth [73,74], Shimazaki and Shinomoto [75,76], and a recent method for estimating entropy in hydrologic data proposed by Gong et al. [67]. In this work, we use common techniques reported in the literature, although there are other methods [77].…”
Section: Bin-Counting Methods and Entropic Estimators (mentioning)
Confidence: 99%
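For reference, three of the classical rules named in this passage have simple closed forms: Sturges' rule k = ⌈log2 N⌉ + 1, Scott's rule with bin width h = 3.49 σ N^(−1/3), and the Freedman-Diaconis rule with h = 2·IQR·N^(−1/3). A minimal sketch of just these three (the Bayesian, Shimazaki-Shinomoto, and hydrologic estimators cited are not reproduced here):

```python
import numpy as np

def sturges_bins(x):
    """Sturges' rule: k = ceil(log2 N) + 1."""
    return int(np.ceil(np.log2(len(x)))) + 1

def scott_bins(x):
    """Scott's rule: bin width h = 3.49 * sigma * N^(-1/3)."""
    h = 3.49 * np.std(x, ddof=1) * len(x) ** (-1.0 / 3.0)
    return int(np.ceil((x.max() - x.min()) / h))

def freedman_diaconis_bins(x):
    """Freedman-Diaconis rule: bin width h = 2 * IQR * N^(-1/3)."""
    q75, q25 = np.percentile(x, [75, 25])
    h = 2.0 * (q75 - q25) * len(x) ** (-1.0 / 3.0)
    return int(np.ceil((x.max() - x.min()) / h))

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
print(sturges_bins(x), scott_bins(x), freedman_diaconis_bins(x))
```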
“…Computing TE is a challenging problem due to its computational complexity. Therefore, different numerical recipes have been suggested [5]. TE has already been used for time series analysis in different fields, such as clinical electroencephalography [4, 6, 7], financial data [8], and biophysics [2].…”
Section: Introduction (mentioning)
Confidence: 99%
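To make the computational burden concrete: even a naive plug-in (histogram) estimate of transfer entropy with history length one requires a three-dimensional joint distribution, TE_{X→Y} = Σ p(y_{t+1}, y_t, x_t) log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. The sketch below illustrates only this naive estimator; it is not the recipe proposed in the paper, which treats binning, history lengths, and estimation uncertainty much more carefully.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in TE estimate (in bits) from source x to target y, history length 1."""
    # Discretize each series into equal-width bins (indices 0..bins-1).
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    # Joint counts over (y_{t+1}, y_t, x_t).
    joint = np.zeros((bins, bins, bins))
    for a, b, c in zip(y_next, y_now, x_now):
        joint[a, b, c] += 1
    p_abc = joint / joint.sum()

    p_bc = p_abc.sum(axis=0, keepdims=True)       # p(y_t, x_t)
    p_ab = p_abc.sum(axis=2, keepdims=True)       # p(y_{t+1}, y_t)
    p_b = p_abc.sum(axis=(0, 2), keepdims=True)   # p(y_t)

    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = (p_abc * p_b) / (p_ab * p_bc)
        terms = np.where(p_abc > 0, p_abc * np.log2(ratio), 0.0)
    return float(np.nansum(terms))

# Example: y is driven by the past of x, so TE(x->y) should exceed TE(y->x).
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```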