2003
DOI: 10.1103/physrevlett.91.238701

Network Information and Connected Correlations

Abstract: Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible N-point correlation is measured by a decrease in entropy for the joint distribution of N variables relative to the maximum entropy allowed by all the observed N − 1 variable distributions. We calculate the "connected information" terms for several examples, and show that it also enables the decomposition of the information…
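As a minimal sketch of this definition (our own illustration, not code from the paper): for N = 2 the maximum-entropy distribution consistent with the single-variable marginals is simply the product of the marginals, so the second-order connected information reduces to the familiar mutual information. The joint distribution below is hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a distribution given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of two correlated binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# The maximum-entropy distribution consistent with the one-variable
# marginals is the product of the marginals; the entropy it loses
# relative to the true joint is the N = 2 connected information,
# i.e. the mutual information I(X;Y).
p_indep = np.outer(p_xy.sum(axis=1), p_xy.sum(axis=0))
I2 = entropy(p_indep) - entropy(p_xy)
print(f"I^(2) = {I2:.4f} bits")  # ≈ 0.2781 bits here
```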


Cited by 277 publications (391 citation statements)
References 17 publications
“…Each of these distributions has its associated entropy, $S_G^{(\nu)} = -\sum_{\vec{\sigma}} P_G^{(\nu)}(\vec{\sigma}) \log_2 P_G^{(\nu)}(\vec{\sigma})$. Following (27), the $\nu$-th order correlations within the glider carry $I_G^{(\nu)} = S_G^{(\nu-1)} - S_G^{(\nu)}$ bits of information about local texture. These information-theoretic quantities measure order in a texture that arises from correlations that involve exactly $\nu$ pixels.…”
Section: Results (mentioning)
confidence: 99%
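For the lowest rung of this entropy ladder, the first-order term has a closed form (a hedged sketch with a made-up bias, not numbers from the cited study): with $Q$ binary pixels, the unconstrained maximum entropy is $S^{(0)} = Q$ bits, and the maximum-entropy distribution consistent with single-pixel marginals factorizes, so a uniform bias $p$ toward one luminance value gives $I^{(1)} = Q(1 - H_2(p))$.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Q binary pixels, each equal to 1 with (hypothetical) probability p.
# Unconstrained: S^(0) = Q bits. Constrained by single-pixel marginals,
# the max-entropy distribution factorizes: S^(1) = Q * h2(p).
# First-order (luminance-bias) information: I^(1) = S^(0) - S^(1).
Q, p = 4, 0.7
I1 = Q * (1.0 - h2(p))
print(f"I^(1) = {I1:.4f} bits")  # ≈ 0.4748 bits for p = 0.7
```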
“…Because pixels are not independent, the entropy $S_G$ will in general be less than four bits; we can write $S_G = Q - \sum_{\nu=1}^{Q} I_G^{(\nu)}$, where $Q = 4$ is the number of binary pixels in the glider, and $I_G^{(\nu)}$ measures the bits of entropy reduction caused by luminance bias ($\nu = 1$), and by pair ($\nu = 2$), triplet ($\nu = 3$), and quadruplet ($\nu = 4$) correlations (27).…”
Section: Results (mentioning)
confidence: 99%
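As a consistency check between the two excerpts above (our own one-line derivation in the cited paper's notation): the per-order terms telescope, with $S_G^{(0)} = Q$ bits for the unconstrained uniform distribution over $Q$ binary pixels and $S_G^{(Q)} = S_G$ the true glider entropy:

```latex
\sum_{\nu=1}^{Q} I_G^{(\nu)}
  = \sum_{\nu=1}^{Q} \bigl( S_G^{(\nu-1)} - S_G^{(\nu)} \bigr)
  = S_G^{(0)} - S_G^{(Q)}
  = Q - S_G ,
```

which rearranges to the decomposition $S_G = Q - \sum_{\nu=1}^{Q} I_G^{(\nu)}$ quoted above.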
“…A similar measure has also been introduced for multipartite probability distributions in Ref. 18. It will allow us to show that $\Delta(\Psi)$ is equal to the number of GHZ states extractable from $|\Psi\rangle$ by arbitrary local unitaries.…”
Section: Beyond Stabilizer States (mentioning)
confidence: 99%
“…The fraction of information retained by describing the system with a given measure, as opposed to the true joint entropy, is then $0 \le I_m/I_N \le 1$. If the measure used is the bivariate probability distribution, we call $I_m$ the pairwise network information, or the second-order connected information as defined in [16]. This is approximated linearly if the measure used is the cross-correlation, and nonlinearly if it is the mutual information.…”
(mentioning)
confidence: 99%
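To make the fraction $I_m/I_N$ concrete (a sketch under our own assumptions: the fitting routine and the toy distribution below are illustrative, not code or data from [16] or [17]): the pairwise maximum-entropy distribution can be fitted by iterative proportional scaling, and for an XOR-like triplet, whose structure is purely third order, the pairwise network information captures none of the total correlation.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def maxent_pairwise(p_joint, n_sweeps=100):
    """Max-entropy distribution matching all pairwise marginals of p_joint,
    fitted by iterative proportional scaling (our own sketch)."""
    axes = range(p_joint.ndim)
    q = np.full_like(p_joint, 1.0 / p_joint.size)
    for _ in range(n_sweeps):
        for i, j in itertools.combinations(axes, 2):
            rest = tuple(a for a in axes if a not in (i, j))
            ratio = p_joint.sum(axis=rest) / q.sum(axis=rest)
            shape = [s if a in (i, j) else 1 for a, s in enumerate(q.shape)]
            q = q * ratio.reshape(shape)
    return q

# Hypothetical XOR-like triplet: even-parity states are four times as
# likely as odd-parity ones, so all the structure is third order.
p = np.empty((2, 2, 2))
for x, y, z in itertools.product((0, 1), repeat=3):
    p[x, y, z] = 0.2 if (x + y + z) % 2 == 0 else 0.05

S1 = sum(entropy(p.sum(axis=tuple(a for a in range(3) if a != i)))
         for i in range(3))            # independent (first-order) entropy
S2 = entropy(maxent_pairwise(p))       # pairwise max-entropy entropy
SN = entropy(p)                        # true joint entropy

I_m = S1 - S2                          # pairwise network information
I_N = S1 - SN                          # total correlation
print(f"I_m/I_N = {I_m / I_N:.3f}")    # 0.000: pairwise terms see nothing
```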
“…The corresponding framework was first laid out in [16] and later applied in [17], where the authors assessed the rationale of looking only at the pairwise correlations between neurons. They examined how well the maximum entropy distribution consistent with all the pairwise correlations described the system.…”
(mentioning)
confidence: 99%