2016
DOI: 10.3390/e18020038

Understanding Interdependency Through Complex Information Sharing

Abstract: The interactions between three or more random variables are often nontrivial, poorly understood and, yet, are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature that focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the var…

Cited by 47 publications (55 citation statements)
References: 47 publications
“…It is possible that, were an agreeable multivariate partial information measure to be developed, the decomposition of, e.g., I[(X_0, X_1, X_2) : X_0 X_1 X_2] could lead to a satisfactory symmetric decomposition. In any case, there has been longstanding interest in creating a symmetric decomposition analogous to the partial information decomposition [46] with some recent progress [79–81].…”
Section: Discussion (mentioning, confidence: 99%)
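To make the quoted suggestion concrete: since the joint source (X_0, X_1, X_2) fully determines the composite target X_0 X_1 X_2, the quantity proposed for decomposition reduces to the joint entropy. A minimal worked identity (standard information theory, not drawn from the cited text):

I[(X_0, X_1, X_2) : X_0 X_1 X_2] = H(X_0, X_1, X_2) - H(X_0, X_1, X_2 | X_0, X_1, X_2) = H(X_0, X_1, X_2),

so a complete partial-information decomposition of this quantity would indeed allot the whole joint entropy, which is what a symmetric decomposition requires.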
“…They showed that, even though the PID atoms are among the very few measures that can distinguish between the two kinds of systems, a PID lattice with two source variables and one target variable cannot allot the full joint entropy H(X, Y, Z) of either system. The decomposition of the joint entropy in terms of information components that reflect qualitatively different interactions within the system has also been subject of recent research, that however relies on constructions differing substantially from the PID lattice [29,33].…”
Section: Preliminaries and State of the Art (mentioning, confidence: 99%)
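A one-line accounting, assuming the standard bivariate PID with sources X, Y and target Z, shows why the quoted limitation arises: the four PID atoms (redundancy, the two unique terms, and synergy) sum to I(X, Y : Z) = H(Z) - H(Z | X, Y), whereas the full joint entropy obeys

H(X, Y, Z) = H(X, Y) + H(Z | X, Y),

so the residual uncertainty H(Z | X, Y) and the entire source-side entropy H(X, Y) fall outside what the two-source lattice can allot.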
“…Understanding how information is distributed in trivariate systems should also provide a descriptive allotment of all parts of the joint entropy H(X, Y, Z) [21,33]. For comparison, Shannon's mutual information enables a semantic decomposition of the bivariate entropy H(X, Y) in terms of univariate conditional entropies and I(X : Y), that quantifies shared fluctuations (or covariations) between the two variables [33]:…”
Section: Decomposing the Joint Entropy of a Trivariate System (mentioning, confidence: 99%)
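The identity the excerpt points to after the colon is the standard bivariate decomposition (reproduced here from basic information theory rather than quoted from [33]):

H(X, Y) = H(X | Y) + H(Y | X) + I(X : Y),

i.e., the joint entropy splits into the two univariate conditional entropies plus the shared term I(X : Y) that quantifies the covariations between the two variables.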
“…The value of the predictive power will vary between 0% when X is not informative of Y and 100% when X is a perfect predictor of Y, i.e., X ≡ Y. In the literature, there has been numerous proposals for the use of information theory as a measure of temporal evolution [13]. Some are symmetric measures, such as the predictive information [12], and others enable, at the price of numerous acquisition time-points, to determine causal relationships between variables, such as the transfer entropy [14] or the Granger causality [15].…”
Section: Modeling a Shannon-like Communication Channel for Multicompo… (mentioning, confidence: 99%)
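The excerpt states the endpoints of the predictive power but not its formula. One normalization consistent with those endpoints (an assumption for illustration, not necessarily the cited paper's exact definition) is the mutual information scaled by the target entropy:

predictive power = 100% × I(X : Y) / H(Y),

which equals 0% when X and Y are independent (I(X : Y) = 0) and 100% when X ≡ Y (since then I(X : Y) = H(Y)).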