2007 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2007.4557201

Infinitely Many Information Inequalities

Abstract: When finite, Shannon entropies of all subvectors of a random vector are considered for the coordinates of an entropic point in Euclidean space. A linear combination of the coordinates gives rise to an unconstrained information inequality if it is nonnegative for all entropic points. With at least four variables no finite set of linear combinations generates all such inequalities. This is proved by constructing explicitly an infinite sequence of new linear information inequalities and a curve in a special geome…
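
To make the entropic-point construction concrete, here is a minimal Python sketch (illustrative only; the function name entropic_point and the choice of bits as units are mine, not the paper's) that maps a joint distribution of n discrete variables to the vector of Shannon entropies of all its subvectors:

import itertools
import numpy as np

def entropic_point(pmf):
    """Map a joint pmf over n discrete variables to its entropic point:
    the Shannon entropy H(X_A), in bits, of every nonempty subset A of axes.

    pmf: numpy array of shape (k1, ..., kn), nonnegative, summing to 1.
    Returns a dict {subset of axes (tuple): entropy in bits}.
    """
    n = pmf.ndim
    point = {}
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            # Marginalize out the axes not in the subset.
            other = tuple(ax for ax in range(n) if ax not in subset)
            p = np.asarray(pmf.sum(axis=other)).ravel()
            p = p[p > 0]  # 0 log 0 = 0 by convention
            point[subset] = float(-(p * np.log2(p)).sum())
    return point

# Example: two perfectly correlated fair bits.
# All three coordinates equal 1 bit: H(X0) = H(X1) = H(X0, X1) = 1.
pmf = np.array([[0.5, 0.0], [0.0, 0.5]])
print(entropic_point(pmf))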



Cited by 133 publications (99 citation statements)
References 14 publications
“…It has been a long-standing open question to determine whether there are any further universal entropy inequalities apart from the classical ones [11][12][13]. For probability distributions, this question has been answered in the affirmative in the breakthrough works [14,15]: for n ≥ 4 random variables, there is an infinite number of independent entropy inequalities satisfied by the Shannon entropy. In network coding theory, they give rise to tighter capacity bounds [16].…”
Section: JHEP09(2015)130
confidence: 99%
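
For reference, the "classical" inequalities referred to above are the Shannon-type (polymatroid) inequalities, which for subsets A, B of the variables read

H(X_A) + H(X_B) ≥ H(X_{A∪B}) + H(X_{A∩B})   (submodularity)
H(X_A) ≤ H(X_B) whenever A ⊆ B              (monotonicity)

and the first inequality beyond them, due to Zhang and Yeung, is commonly stated for four random variables A, B, C, D as

2 I(C;D) ≤ I(A;B) + I(A;C,D) + 3 I(C;D|A) + I(C;D|B).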
“…For these, not only are there extreme rays for n ≥ 3 that can only be attained approximately, so that the corresponding entropy cones are not closed, but there are in fact linear inequalities that constrain some of the lower-dimensional faces when n ≥ 4 [12,17,18]. It is moreover known for the Shannon entropy (and likewise conjectured for the von Neumann entropy) that the cones are not polyhedral for n ≥ 4 [15].…”
Section: JHEP09(2015)130
confidence: 99%
“…Using the same technique proposed in [1], more and more linear information inequalities have been discovered [8,[13][14][15]. Later, in [9], Matúš obtained a countably infinite set of linear information inequalities for a set of four random variables. Using the same set of inequalities, Matúš further proved that Γ*(N₄) is not polyhedral.…”
Section: Non-polyhedral Property
confidence: 99%
“…This particular approach for construction has been the main ingredient in every non-Shannon inequality that has been subsequently discovered. Using this approach, new inequalities can be found mechanically [8], and there are in fact infinitely many such independent inequalities even when only four random variables are involved [9]. Despite this progress, a complete characterisation is still missing.…”
Section: Introduction
confidence: 99%
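
The "mechanical" search described above suggests a simple numerical sanity check. The sketch below is my own illustration, not the construction from [8] or [9]: it evaluates the slack of the Zhang–Yeung inequality on a random four-variable distribution, and the slack must be nonnegative for every valid joint pmf.

import numpy as np

def H(pmf, axes):
    """Shannon entropy (bits) of the marginal of pmf on the given axes."""
    other = tuple(ax for ax in range(pmf.ndim) if ax not in axes)
    p = np.asarray(pmf.sum(axis=other)).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def I(pmf, X, Y, Z=()):
    """Conditional mutual information I(X;Y|Z) in bits; X, Y, Z are axis tuples."""
    hz = H(pmf, Z) if Z else 0.0
    return H(pmf, X + Z) + H(pmf, Y + Z) - H(pmf, X + Y + Z) - hz

def zhang_yeung_slack(pmf):
    """Slack of 2 I(C;D) <= I(A;B) + I(A;CD) + 3 I(C;D|A) + I(C;D|B)
    for axes (A, B, C, D) = (0, 1, 2, 3)."""
    A, B, C, D = (0,), (1,), (2,), (3,)
    return (I(pmf, A, B) + I(pmf, A, C + D)
            + 3 * I(pmf, C, D, A) + I(pmf, C, D, B)
            - 2 * I(pmf, C, D))

rng = np.random.default_rng(0)
pmf = rng.random((2, 2, 2, 2))
pmf /= pmf.sum()
print(zhang_yeung_slack(pmf) >= -1e-9)  # True: the inequality holds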
“…The non-Shannon inequalities provide outer bounds for the entropy region, and it has been shown that no finite number of these inequalities can characterize the entropy region completely [13]. Inner bounds are less often studied in the literature, and the most well-known inner region is defined through the Ingleton inequality, which was first obtained for the ranks of vector spaces [14].…”
Section: A. What Is Known About the Entropy Region?
confidence: 99%
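
For reference, the Ingleton inequality mentioned above, translated from ranks to entropies of four random variables A, B, C, D, is commonly written as

I(A;B) ≤ I(A;B|C) + I(A;B|D) + I(C;D).

Roughly speaking, the points satisfying all its instances, together with the Shannon-type inequalities, give the inner region referred to in the quotation.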