“…The multivariate MI can be written as an expansion of the entropies of the variables:

I(X_1; X_2; \ldots; X_N) = \sum_{\emptyset \neq T \subseteq S} (-1)^{|T|+1} H(T)

where S = {X_1, X_2, …, X_N}, the sum runs over all non-empty subsets T of S, and |T| denotes the cardinality of T. For example, for two variables we have:

I(X; Y) = H(X) + H(Y) - H(X, Y)

Similarly, the multivariate II for N variables has the following expansion (here we use the sign convention of [74]):

II(X_1; X_2; \ldots; X_N) = -\sum_{\emptyset \neq T \subseteq S} (-1)^{|S| - |T|} H(T)

For three variables the interaction information II(X, Y, Z) can be written as follows:

II(X, Y, Z) = H(X, Y) + H(X, Z) + H(Y, Z) - H(X) - H(Y) - H(Z) - H(X, Y, Z) = I(X; Y | Z) - I(X; Y)

Notably, entropies are just (signed) sums of II (with the convention that I(T) = H(X_i) for a singleton T = {X_i}):

H(X_1, \ldots, X_N) = \sum_{\emptyset \neq T \subseteq S} (-1)^{|T|+1} I(T) = -\sum_{\emptyset \neq T \subseteq S} II(T)

The second metric explored in this work was TC [71], which quantifies the redundancy or dependency among the variables in the set and is defined as:

T(S) = \sum_{i=1}^{N} H(X_i) - H(X_1, X_2, \ldots, X_N)

It is equal to the Kullback–Leibler divergence between the joint distribution and the product of its marginals, and is hence always non-negative. While II and MI are essentially the same, as they only differ by an alternating sign convention:

II(T) = (-1)^{|T|} I(T)

TC is quite different: it quantifies the total amount of II, or the “generalised correlation”, in the sense that it sums over all possible higher-order interaction informations (over all pairs, triplets, …):

T(S) = \sum_{T \subseteq S, |T| \geq 2} II(T)

Just as for two variables, TC and II or MI quantify the statistical dependencies among the N variables, but in different ways (a numerical illustration follows this list):
- For TC: the N variables {X_1, X_2, …, X_N} = S are statistically independent if and only if T(S) = 0 [75].
- For II or MI: the N variables {X_1, X_2, …, X_N} = S are statistically independent if and only if II(T) = 0 for all subsets T of S with |T| ≥ 2 [1].
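To make these expansions and sign conventions concrete, here is a minimal Python sketch (not part of the original text) that computes them by brute force on a small discrete joint distribution; the helper names (entropy_of_marginal, multivariate_mi, interaction_information, total_correlation) are illustrative, not a published API. It numerically checks the two identities above: H(S) = -Σ_T II(T) and T(S) = Σ_{|T| ≥ 2} II(T).

```python
import itertools

import numpy as np

def entropy_of_marginal(p, axes_keep):
    # Shannon entropy (in bits) of the marginal of the joint table p on axes_keep.
    axes_drop = tuple(ax for ax in range(p.ndim) if ax not in axes_keep)
    marg = p.sum(axis=axes_drop) if axes_drop else p
    marg = marg[marg > 0]  # convention: 0 log 0 = 0
    return float(-(marg * np.log2(marg)).sum())

def multivariate_mi(p, subset):
    # I(T): alternating sum of marginal entropies over all non-empty
    # subsets of T, with I({X_i}) = H(X_i) for singletons.
    return sum((-1) ** (k + 1) * entropy_of_marginal(p, t)
               for k in range(1, len(subset) + 1)
               for t in itertools.combinations(subset, k))

def interaction_information(p, subset):
    # II(T) = (-1)^{|T|} I(T), the sign convention of [74] used in the text.
    return (-1) ** len(subset) * multivariate_mi(p, subset)

def total_correlation(p):
    # T(S) = sum_i H(X_i) - H(X_1, ..., X_N).
    n = p.ndim
    return (sum(entropy_of_marginal(p, (i,)) for i in range(n))
            - entropy_of_marginal(p, tuple(range(n))))

# A small random joint distribution over three binary variables.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

S = (0, 1, 2)
subsets = [t for k in range(1, len(S) + 1)
           for t in itertools.combinations(S, k)]

# Check 1: the joint entropy is minus the sum of II over all non-empty subsets.
h_joint = entropy_of_marginal(p, S)
h_from_ii = -sum(interaction_information(p, t) for t in subsets)

# Check 2: TC is the plain sum of II over all subsets with |T| >= 2
# (pairs, triplets, ...).
tc = total_correlation(p)
tc_from_ii = sum(interaction_information(p, t) for t in subsets if len(t) >= 2)

print(f"H(S) = {h_joint:.6f}  vs  -sum II     = {h_from_ii:.6f}")
print(f"T(S) = {tc:.6f}  vs  sum II(|T|>=2) = {tc_from_ii:.6f}")
```

If p is replaced by a product distribution p(x, y, z) = p(x) p(y) p(z), total_correlation and every interaction_information with |T| ≥ 2 evaluate to zero (up to floating-point error), matching the two independence criteria in the list above.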
…”