2010
DOI: 10.1007/s10827-010-0231-x

On directed information theory and Granger causality graphs

Abstract: Directed information theory deals with communication channels with feedback. When applied to networks, a natural extension based on causal conditioning is needed. We show here that measures built from directed information theory in networks can be used to assess Granger causality graphs of stochastic processes. We show that directed information theory includes measures such as the transfer entropy, and that it is the adequate information theoretic framework needed for neuroscience applications, such as connect…
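For reference, a standard definition of the transfer entropy mentioned in the abstract (the notation here is ours, not quoted from the paper): with X^{n-1} = (X_1, \dots, X_{n-1}) denoting the past of a process X,

T_{X \to Y} \;=\; I\big(Y_n ; X^{n-1} \,\big|\, Y^{n-1}\big) \;=\; H\big(Y_n \,\big|\, Y^{n-1}\big) - H\big(Y_n \,\big|\, Y^{n-1}, X^{n-1}\big),

i.e., the reduction in uncertainty about Y_n gained by conditioning on the past of X in addition to the past of Y. The abstract's claim is that such measures arise naturally from directed information theory under causal conditioning.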

Cited by 124 publications (122 citation statements)
References: 44 publications
“…It is now clear that there is no single universal approach to understanding connectivity in the brain (Amblard and Michel 2011) and approaches extend from non-directional measures such as mutual information (Jirsa and McIntosh 2007) to measures that utilize directional information flow such as transfer entropy (Lungarella and Sporns 2006). Lungarella and Sporns (2006) show that there is a distinct link between the morphology of sensors and information flow and that coding depends on the morphology and dynamics of sensory systems.…”
Section: Computational Models and Network Inference
confidence: 99%
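To make the distinction drawn in this quoted passage concrete, here is a minimal sketch (ours, not taken from the cited works) contrasting a non-directional measure, mutual information, with a directional one, transfer entropy, using plug-in estimates on coarsely binned time series; the bin count, the single-sample lag, and the toy data are illustrative assumptions.

# Minimal sketch (assumption, not from the cited works): contrast a
# non-directional measure (mutual information) with a directional one
# (transfer entropy) on two discretized scalar time series.
import numpy as np
from collections import Counter

def _entropy(samples):
    # Plug-in Shannon entropy (in bits) of a list of hashable symbols.
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y): symmetric, hence non-directional.
    return _entropy(list(x)) + _entropy(list(y)) - _entropy(list(zip(x, y)))

def transfer_entropy(x, y):
    # T_{X->Y} = H(Y_n | Y_{n-1}) - H(Y_n | Y_{n-1}, X_{n-1}), lag 1 only.
    yn, yp, xp = y[1:], y[:-1], x[:-1]
    return (_entropy(list(zip(yn, yp))) - _entropy(list(yp))
            - _entropy(list(zip(yn, yp, xp))) + _entropy(list(zip(yp, xp))))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)  # y follows x with a one-step lag
xb = np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))
yb = np.digitize(y, np.quantile(y, [0.25, 0.5, 0.75]))
print("I(X;Y)  =", mutual_information(xb, yb))
print("T(X->Y) =", transfer_entropy(xb, yb))
print("T(Y->X) =", transfer_entropy(yb, xb))

On this toy data, where y is driven by the past of x, the estimate of T(X->Y) should clearly exceed T(Y->X), whereas I(X;Y) alone cannot indicate the direction of influence.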
“…Similarly, we can also quantify a degree of causation in bits through calculating the directed information. It is demonstrated by Amblard et al [20]: for linear Gaussian processes, directed information and Granger causality are equivalent. Note that the transfer entropy defined in Eq.…”
Section: Causal Inference: Granger Causality, Transfer Entropy and Directed Information
confidence: 95%
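The equivalence invoked here is usually stated as follows (standard notation, not quoted from reference [20]): for jointly Gaussian processes, the transfer entropy from X to Y equals half of Geweke's Granger causality measure,

F_{X \to Y} \;=\; \ln \frac{\operatorname{var}\big(Y_n - \hat{Y}_n^{(Y)}\big)}{\operatorname{var}\big(Y_n - \hat{Y}_n^{(Y,X)}\big)}, \qquad T_{X \to Y} \;=\; \tfrac{1}{2}\, F_{X \to Y},

where \hat{Y}_n^{(Y)} is the best linear predictor of Y_n from the past of Y alone and \hat{Y}_n^{(Y,X)} additionally uses the past of X. No improvement in prediction error thus corresponds to zero transfer entropy and to the absence of a Granger-causal link.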
“…Directed information, proposed by Marko [18] and re-formalized by others [19,20], is more general for quantifying directional dependencies, and has recently attracted attention [10,21]. It is modified from the mutual information to capture causal influences, denoted as I(X → Y) for two stochastic processes X and Y.…”
Section: Causal Inference: Granger Causality, Transfer Entropy and Directed Information
confidence: 99%
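For completeness, the directed information referred to in this statement is typically defined through causal conditioning (standard notation, not quoted from the sources above): for length-N blocks,

I(X^N \to Y^N) \;=\; \sum_{n=1}^{N} I\big(X^n ; Y_n \,\big|\, Y^{n-1}\big) \;=\; H(Y^N) - H\big(Y^N \,\|\, X^N\big),

where H(Y^N \| X^N) = \sum_{n=1}^{N} H(Y_n \mid Y^{n-1}, X^n) is the causally conditioned entropy. Unlike the mutual information I(X^N ; Y^N) = H(Y^N) - H(Y^N \mid X^N), which conditions on the entire block X^N and is symmetric, the directed information conditions only on the past and present of X and is in general asymmetric, which is what allows it to capture the direction of influence.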
“…(1), (2), (3) and (4) with effective partial symbolic transfer entropy. For challenge (5), i.e., statistical significance, it may be evaluated by using bootstrapping strategies, surrogate data or random permutations [50,51]. Under the surrogate-based testing scheme, we will assess the significance with kernel density estimation.…”
Section: Drive-response Network Inference
confidence: 99%
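As an illustration of the surrogate-based testing mentioned in this last statement, here is a minimal sketch (ours, not the procedure of refs. [50,51]); it uses simple circular time-shift surrogates rather than kernel density estimation, and the statistic passed in could be, for example, the transfer_entropy sketch given earlier, which is an assumption of this example.

# Minimal sketch (assumption, not the cited papers' procedure): assess the
# statistical significance of a directed coupling statistic by comparing its
# value on the original pair (x, y) with a null distribution built from
# surrogates in which the temporal relation between x and y is destroyed.
import numpy as np

def surrogate_pvalue(x, y, statistic, n_surrogates=200, seed=0):
    rng = np.random.default_rng(seed)
    observed = statistic(x, y)
    null = np.empty(n_surrogates)
    for k in range(n_surrogates):
        # Circular time shift preserves the marginal dynamics of x
        # but breaks any lagged x -> y dependence.
        shift = int(rng.integers(1, len(x) - 1))
        null[k] = statistic(np.roll(x, shift), y)
    # One-sided p-value with the usual +1 correction.
    return (1 + np.sum(null >= observed)) / (n_surrogates + 1)

# Hypothetical usage with the binned series xb, yb from the earlier sketch:
# p = surrogate_pvalue(xb, yb, transfer_entropy)
# print("significant X -> Y coupling" if p < 0.05 else "not significant")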