2012
DOI: 10.1103/physreve.86.041901

Framework to study dynamic dependencies in networks of interacting processes

Abstract: The analysis of dynamic dependencies in complex systems such as the brain helps to understand how emerging properties arise from interactions. Here we propose an information-theoretic framework to analyze the dynamic dependencies in multivariate time-evolving systems. This framework constitutes a fully multivariate extension and unification of previous approaches based on bivariate or conditional mutual information and Granger causality or transfer entropy. We define multi-information measures that allow us to…
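The abstract is truncated at this point; as background for the quantities it builds on (standard textbook notation, not necessarily the paper's own), for processes X and Y with present states X_n, Y_n and past histories X_n^-, Y_n^-, the bivariate and conditional quantities it unifies are usually written as

\[
I(X;Y) = H(X) + H(Y) - H(X,Y), \qquad
I(X;Y \mid Z) = H(X \mid Z) - H(X \mid Y,Z),
\]
\[
T_{X \to Y} = I\!\left(Y_n ;\, X_n^- \,\middle|\, Y_n^-\right), \qquad
M(X_1,\ldots,X_N) = \sum_{i=1}^{N} H(X_i) - H(X_1,\ldots,X_N),
\]

where H denotes Shannon entropy, T_{X→Y} is the transfer entropy (equivalent to Granger causality for jointly Gaussian processes), and M is the multi-information (total correlation); the "multi-information measures" mentioned in the abstract presumably generalize quantities of this kind to the full multivariate network.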

Cited by 51 publications (67 citation statements)
References 91 publications
“…An important but not fully explored aspect is that the measures of information dynamics are often used in isolation, thus limiting their interpretational capability. Indeed, recent studies have pointed out the intertwined nature of the measures of information dynamics, and the need to combine their evaluation to avoid misinterpretations about the underlying network properties [4,12,30]. Moreover, the specificity of measures of information storage and transfer is often limited by the fact that their definition incorporates multiple aspects of the dynamical structure of network processes; the high flexibility of information-theoretic measures allows one to overcome this limitation by expanding these measures into meaningful quantities [13,29].…”
Section: Introduction (mentioning)
confidence: 99%
“…Within this framework, several tools that include the concept of temporal precedence within the computation of standard information-theoretic measures have been proposed to provide a quantitative description of how collective behaviors in multivariate systems arise from the interaction between the individual system components. These tools formalize different information-theoretic concepts applied to a "target" system in the observed dynamical network: the predictive information about the system describes the amount of information shared between its present state and the past history of the whole observed network [4,5]; the information storage indicates the information shared between the present and past states of the target [6,7]; the information transfer defines the information that a group of systems designated as "sources" provide about the present state of the target [8,9]; and the information modification reflects the redundant or synergetic interaction between multiple sources sending information to the target [3,10]. Operational definitions of these concepts have been proposed in recent years, which allow one to quantify predictive information through measures of prediction entropy or full-predictability [11,12], information storage through the self-entropy or self-predictability [11,13], information transfer through transfer entropy or Granger causality [14], and information modification through entropy and prediction measures of net redundancy/synergy [11,15] or separate measures derived from partial information decomposition [16,17].…”
Section: Introduction (mentioning)
confidence: 99%
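To make the measures enumerated in the preceding excerpt concrete, here is a minimal sketch (my own illustration, not code from the cited works) of how predictive information, information storage, and transfer entropy can be estimated for a bivariate system under a linear-Gaussian assumption, where each measure reduces to a log-ratio of linear prediction-error variances; the lag order p and the series names x, y are illustrative choices.

    import numpy as np

    def _lagged(s, p):
        # Columns s[t-1], ..., s[t-p], aligned with s[p:].
        return np.column_stack([s[p - k:len(s) - k] for k in range(1, p + 1)])

    def _residual_var(target, regressors):
        # Variance of the ordinary-least-squares residual.
        beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
        return np.var(target - regressors @ beta)

    def info_measures(y, x, p=2):
        # Predictive information, storage, and transfer for target y, source x (nats).
        y = y - y.mean()
        x = x - x.mean()
        yp, xp = _lagged(y, p), _lagged(x, p)           # past histories
        yn = y[p:]                                      # present of the target
        v0  = np.var(yn)                                # unconditioned variance
        vy  = _residual_var(yn, yp)                     # given the target's own past
        vxy = _residual_var(yn, np.hstack([yp, xp]))    # given both pasts
        predictive = 0.5 * np.log(v0 / vxy)             # I(Y_n ; X^-, Y^-)
        storage    = 0.5 * np.log(v0 / vy)              # I(Y_n ; Y^-)
        transfer   = 0.5 * np.log(vy / vxy)             # I(Y_n ; X^- | Y^-), Granger/TE
        return predictive, storage, transfer

    # Toy check: white noise x drives an AR(1) target y with one lag.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(5000)
    y = np.zeros(5000)
    for t in range(1, 5000):
        y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()
    print(info_measures(y, x))

By construction of the log-ratios, predictive information equals storage plus transfer in this bivariate case, which illustrates why the excerpt treats these quantities as intertwined rather than independent measures.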
“…The synergetic effects that we address here, related to the analysis of dynamical influences in multivariate time series, are similar to those encountered in sociological and psychological modeling, where suppressor is the name given to variables that increase the predictive validity of another variable after their inclusion into a linear regression equation [11]. For further details, see also [12], [7], where information-based approaches were applied to address collective influences.…”
Section: Introduction (mentioning)
confidence: 98%
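As a concrete illustration of the suppressor effect mentioned above, the following toy example (my own construction, not taken from [11] or [12]) builds a regressor x2 that is nearly uncorrelated with the outcome y yet substantially raises the variance explained once it is added alongside x1:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    signal = rng.standard_normal(n)          # the part of x1 that actually drives y
    x2 = rng.standard_normal(n)              # unrelated to y ...
    x1 = signal + x2                         # ... but mixed into x1
    y = signal + 0.3 * rng.standard_normal(n)

    def r_squared(y, X):
        X = np.column_stack([np.ones(len(y)), X])    # add an intercept
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return 1.0 - np.var(y - X @ beta) / np.var(y)

    print("corr(x2, y):       ", round(np.corrcoef(x2, y)[0, 1], 3))                 # close to 0
    print("R^2 with x1 alone: ", round(r_squared(y, x1), 3))                         # about 0.46
    print("R^2 with x1 and x2:", round(r_squared(y, np.column_stack([x1, x2])), 3))  # about 0.92

In information-theoretic terms this is a synergetic configuration: x2 carries almost no information about y on its own but does so jointly with x1, which is the kind of collective influence the quoted passage refers to.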
“…If enough is known about the measures and the way in which they interact, a fruitful approach is to construct mechanism models and compare such models to experimental data. If less is known, a data-driven approach is often needed, where their interactions are estimated from data [20]. In other words, when the intrinsic mechanism of a real-life network is not clear, which is the situation we will be concerned with here, the data-driven paradigm might be more suitable.…”
Section: Related Work (mentioning)
confidence: 99%
“…The relationship among topological measures has been a research topic for several years [4,14-28], and there are mainly two paradigms, i.e., analytical modeling and data-driven modeling. For a few topological measures of model networks, i.e., networks generated with a certain algorithm, some analytical interrelationships have been found.…”
Section: Related Work (mentioning)
confidence: 99%