2019
DOI: 10.3390/e21010088

Symmetries among Multivariate Information Measures Explored Using Möbius Operators

Abstract: Relations between common information measures include the duality relations based on Möbius inversion on lattices, which are the direct consequence of the symmetries of the lattices of the sets of variables (subsets ordered by inclusion). In this paper we use the lattice and functional symmetries to provide a unifying formalism that reveals some new relations and systematizes the symmetries of the information functions. To our knowledge, this is the first systematic examination of the full range of relationshi…

Cited by 7 publications (10 citation statements)
References 17 publications
“…. , M, are defined by Equations (9)-(12) after plugging the mean and variance given by Equations (20) and (21).…”
Section: Conditional Distributions
confidence: 99%
“…Mutual information, I, between two random variables, X_s and X_u, compares the uncertainty of measuring the variables jointly with the uncertainty of measuring the two variables independently; it identifies nonlinear dependence between the two variables [41–43], and is non-negative and symmetric. Generalizations of bivariate mutual information to more than two variables have been analyzed in a few different scenarios [20,21,41–43]. The direct extension of bivariate mutual information, expressed by Equation (45), to n variables X_1, X_2, …, X_n is called the multi-information [44,45], also known as total correlation, and is defined by:…”
Section: Information Measures
confidence: 99%
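The multi-information (total correlation) mentioned in the statement above is the sum of the marginal entropies minus the joint entropy. A minimal empirical sketch of that definition follows; the function names and the plug-in (frequency-count) estimator are illustrative, not taken from the cited paper:

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy in bits of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    probs = np.array([c / n for c in counts.values()])
    return -np.sum(probs * np.log2(probs))

def total_correlation(data):
    """Multi-information C(X_1..X_n) = sum_i H(X_i) - H(X_1,...,X_n),
    estimated from `data`, a list of joint observations (tuples)."""
    joint = [tuple(row) for row in data]
    marginal_sum = sum(entropy([row[i] for row in data])
                       for i in range(len(data[0])))
    return marginal_sum - entropy(joint)

# Example: X3 = XOR(X1, X2), all four inputs equally likely.
# Marginal entropies are 1 bit each; the joint entropy is 2 bits.
data = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(total_correlation(data))  # → 1.0
```

Total correlation is zero exactly when the variables are jointly independent, which makes the XOR example a convenient check: the variables are pairwise independent yet one bit of total correlation remains.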
“…The symmetries of the relationships among the information functionals are surprisingly simple, but also subtle. The multiple measures of information theory have strikingly symmetric relations [16,17], and have a number of symmetries that we have previously reported [18]. The symmetries all derive from the fact that all information measures are specific linear combinations of joint entropies, organized by lattices whose partial order is determined by inclusion of variable subsets.…”
Section: Introduction
confidence: 96%
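The statement above notes that information measures are signed linear combinations of joint entropies organized by the subset-inclusion lattice. A small sketch of that idea via Möbius inversion over the subset lattice follows; the sign convention and function names are illustrative assumptions, not the paper's notation:

```python
from itertools import combinations
from collections import Counter
import math

def subsets(s):
    """All subsets of frozenset `s`, including the empty set."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def joint_entropy(data, vars_):
    """Empirical Shannon entropy (bits) of the variables indexed by `vars_`."""
    if not vars_:
        return 0.0
    idx = sorted(vars_)
    counts = Counter(tuple(row[i] for i in idx) for row in data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def interaction_information(data, s):
    """Möbius-style alternating sum over the subset lattice:
    I(S) = -sum over T subset of S of (-1)^(|S|-|T|) H(T).
    For |S| = 2 this reduces to ordinary mutual information."""
    s = frozenset(s)
    return -sum((-1) ** (len(s) - len(t)) * joint_entropy(data, t)
                for t in subsets(s))

# XOR triple: pairwise independent, but jointly dependent.
data = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(interaction_information(data, {0, 1}))     # pairwise MI, ≈ 0
print(interaction_information(data, {0, 1, 2}))  # three-way term, 1.0 under this sign convention
```

The point of the lattice view is that every such measure is the same alternating-sum pattern applied to a different top set S, which is where the symmetries the paper studies come from.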