2021
DOI: 10.1007/978-3-030-80209-7_68
On Information Links


Cited by 1 publication (3 citation statements)
References 20 publications
“…The multivariate MI can be written as an expansion of the entropies of the variables:

$$I(X_1; X_2; \ldots; X_N) = \sum_{\emptyset \neq T \subseteq S} (-1)^{|T|+1} H(T),$$

where S = {X_1, X_2, …, X_N}, the sum runs over all subsets T of S, and |T| denotes the cardinality of T. For example, we have:

$$I(X; Y) = H(X) + H(Y) - H(X, Y).$$

Similarly, the multivariate II for N variables has the following expansion (here we use the sign convention of [74]):

$$II(X_1; X_2; \ldots; X_N) = \sum_{T \subseteq S} (-1)^{|S| - |T|} H(T).$$

For three variables, the interaction information II(X, Y, Z) can be written as follows:

$$II(X, Y, Z) = H(X) + H(Y) + H(Z) - H(X, Y) - H(X, Z) - H(Y, Z) + H(X, Y, Z).$$

Notably, entropies are just sums of II:

$$H(X_1, \ldots, X_N) = \sum_{\emptyset \neq T \subseteq S} II(T).$$

The second metric explored in this work was TC [71], which quantifies the redundancy or dependency among the variables in the set and is defined as:

$$TC(S) = \sum_{i=1}^{N} H(X_i) - H(X_1, \ldots, X_N).$$

It is equal to the Kullback–Leibler divergence between the joint distribution and the product of its marginals,

$$TC(S) = D_{KL}\!\left( p(x_1, \ldots, x_N) \,\Big\|\, \prod_{i=1}^{N} p(x_i) \right),$$

and is hence always positive or null. While II and MI are essentially the same, as they differ only by an alternating sign convention,

$$II(T) = (-1)^{|T|+1}\, I(T),$$

TC is quite different: it quantifies the total amount of II, or the “generalised correlation”, in the sense that it sums over all possible higher-order interaction informations (over all pairs, triplets, …):

$$TC(S) = \sum_{T \subseteq S,\, |T| \geq 2} (-1)^{|T|}\, I(T).$$

Just as for two variables, TC and II or MI quantify the statistical dependencies among N variables, but in different ways. For TC, the N variables {X_1, X_2, …, X_N} = S are statistically independent if and only if TC(S) = 0 [75]. For II or MI, they are statistically independent if and only if II(T) = 0 for all subsets T of S with |T| ≥ 2 [1].…”
Section: Multivariate Information and High-order Statistical Dependencies
confidence: 99%
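To make the quoted definitions concrete, here is a minimal numerical sketch in Python/NumPy, assuming discrete variables whose joint distribution is given as an N-dimensional probability array; all function names are illustrative and not taken from the cited works. It computes the multivariate MI (co-information) via the alternating entropy expansion above, and TC both from its entropy form and as the KL divergence between the joint and the product of its marginals.

```python
import itertools

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero cells are skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def marginal(joint, keep):
    """Marginal distribution over the variable indices listed in `keep`."""
    drop = tuple(i for i in range(joint.ndim) if i not in keep)
    return joint.sum(axis=drop)

def co_information(joint):
    """Multivariate MI: I(S) = sum over nonempty subsets T of (-1)^(|T|+1) H(T)."""
    n = joint.ndim
    return sum(
        (-1) ** (len(T) + 1) * entropy(marginal(joint, T))
        for k in range(1, n + 1)
        for T in itertools.combinations(range(n), k)
    )

def total_correlation(joint):
    """TC(S) = sum_i H(X_i) - H(X_1, ..., X_N)."""
    n = joint.ndim
    return sum(entropy(marginal(joint, (i,))) for i in range(n)) - entropy(joint)

def tc_as_kl(joint):
    """TC(S) as D_KL(joint || product of marginals), for cross-checking."""
    n = joint.ndim
    prod = np.ones_like(joint)
    for i in range(n):
        shape = [1] * n
        shape[i] = joint.shape[i]
        prod = prod * marginal(joint, (i,)).reshape(shape)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / prod[mask])))

# Example: X, Y uniform bits and Z = X XOR Y, a purely synergistic triple.
joint = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        joint[x, y, x ^ y] = 0.25

print(co_information(joint))     # -1.0 bit (synergy)
print(total_correlation(joint))  #  1.0 bit
print(tc_as_kl(joint))           #  1.0 bit, matches the entropy form
```

On the XOR triple the co-information comes out to -1 bit, the classic signature of synergy, while TC is +1 bit; the two TC routines agree, illustrating the KL-divergence identity quoted above.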