2021
DOI: 10.1103/physreve.103.l020102
Local Granger causality

Abstract: Granger causality (GC) is a statistical notion of causal influence based on prediction via linear vector autoregression. For Gaussian variables it is equivalent to transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes. We exploit such equivalence and calculate exactly the local Granger causality, i.e., the profile of the information transferred from the driver to the target process at each discrete time point; in this frame, GC is the avera…
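The abstract's construction can be illustrated numerically: fit a restricted autoregression (target's own past) and a full one (target's plus driver's past), take GC as the log ratio of residual variances, and obtain a local profile as the pointwise log ratio of the two Gaussian predictive densities. The sketch below is an assumption-laden illustration, not the paper's exact procedure; the coupling coefficients and lags are hypothetical, and under this common convention the time average of the local profile recovers the transfer entropy, which for Gaussian variables equals GC/2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1) in which x drives y (hypothetical coefficients).
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.6 * x[t - 1] + rng.standard_normal()

def ols_residuals(target, regressors):
    """Least-squares fit; return residuals and their (MLE) variance."""
    beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    resid = target - regressors @ beta
    return resid, resid.var()

# Restricted model: y[t] from its own past; full model: y[t] from past of y and x.
Y = y[1:]
past_y = np.column_stack([np.ones(n - 1), y[:-1]])
past_yx = np.column_stack([np.ones(n - 1), y[:-1], x[:-1]])

e_r, s2_r = ols_residuals(Y, past_y)
e_f, s2_f = ols_residuals(Y, past_yx)

# Granger causality x -> y: log ratio of residual variances.
gc = np.log(s2_r / s2_f)

# Local profile: pointwise log ratio of the full vs. restricted
# Gaussian predictive densities of y[t].
local = 0.5 * np.log(s2_r / s2_f) + e_r**2 / (2 * s2_r) - e_f**2 / (2 * s2_f)

print(gc, local.mean())  # the mean of the local profile equals gc / 2
```

Because the MLE residual variances are the sample means of the squared residuals, the average of this local profile equals GC/2 exactly (the quadratic terms cancel), matching the Gaussian GC–transfer-entropy correspondence the abstract invokes.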

Cited by 11 publications
(15 citation statements)
References 16 publications
“…the local transfer entropy, are a promising set of techniques that can provide a detailed description of information transfer mechanisms in complex systems. Recently this paradigm has been applied to implement the local Granger causality [30], which has shown interesting results on physiological and neural data. The fine descriptions allowed by the formalism introduced in this paper bring a new perspective over high-order interdependencies, which complement existent pointwise information decomposition approaches (e.g.…”
Section: Discussion
confidence: 99%
“…An important limitation of the O-information is that it characterises a multivariate system with a single number, which summarises to the aggregated effect of various patterns. Building on the rich literature of pointwise information measures [28][29][30], in this paper we introduce the local O-information, which evaluates each pattern separately -such that its ensemble average recovers the O-information. More specifically, the local O-information constitutes an overall measure that characterise the highorder interdependencies between the parts of a multivariate system at each possible pattern of activity.…”
Section: Introduction
confidence: 99%
“…In other words, it provides "an average" Granger causality from one network to another. However, Stramaglia et al [52] proposed a way to identify local Granger causality. Their method offers a robust and computationally fast method to follow the information transfer and the time history of linear stochastic processes and nonlinear complex systems studied in the Gaussian approximation.…”
Section: Application
confidence: 99%
“…We could combine their approach and ours to identify local Granger causality among networks time series as future work. Besides the work of Stramaglia et al [52], there are other methods for time-varying connectivity inference. For a good review, refer to [53].…”
Section: Application
confidence: 99%
“…It is based on the analysis of time series, and on a very simple and intuitive assumption: given two elements A and B, B is causing A if including information about the past of B helps predict the future of A -as, in other words, B contributes in defining the future of A. Since its introduction, this test has been applied to uncountable problems, from economics [4]- [7], engineering [8], sociology [9], biology [10] or neuroscience [11]- [13]; and has been extended to handle different situations and types of data [14]- [17].…”
Section: Introduction
confidence: 99%