2013
DOI: 10.1109/tit.2013.2274614

Conditional Information Inequalities for Entropic and Almost Entropic Points

Abstract: We study conditional linear information inequalities, i.e., linear inequalities for Shannon entropy that hold for distributions whose joint entropies meet some linear constraints. We prove that some conditional information inequalities cannot be extended to any unconditional linear inequalities. Some of these conditional inequalities hold for almost entropic points, while others do not. We also discuss some counterparts of conditional information inequalities for Kolmogorov complexity.
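For a concrete sense of what such an inequality looks like, the first nontrivial example (due to Zhang and Yeung, 1997) can be stated as follows: for any four jointly distributed random variables A, B, C, D,

    if I(A:B) = I(A:B|C) = 0, then I(C:D) ≤ I(C:D|A) + I(C:D|B).

The conclusion is a linear inequality for entropies, but it is asserted only under the two linear constraints on the left.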

Cited by 22 publications (33 citation statements)
References 28 publications
“…At the same time, [32, Theorem 4.1] implies that no point of ri(E_ij) is entropic. This phenomenon can be equivalently rephrased in terms of conditional information inequalities, studied recently in [23,24,25].…”
Section: Entropy Region of Four Variables
confidence: 99%
“…A noteworthy fact is that this result cannot be obtained as a direct implication of any unconditional linear inequality for Shannon's entropy. More precisely, whatever pair of reals λ_1, λ_2 we take, the resulting unconditional inequality does not hold for some distribution; see [24]. We claim that Ingleton's inequality holds also under condition (2), which is weaker than (4).…”
Section: A Generalization of a Conditional Inequality From …
confidence: 80%
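
For reference, Ingleton's inequality mentioned in this excerpt is, in its standard entropy form,

    I(A:B) ≤ I(A:B|C) + I(A:B|D) + I(C:D),

an inequality that holds for all distributions arising from linear (vector-space) constructions but fails for some entropic points in general.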
“…These are the conditional linear information inequalities, which hold only for distributions that satisfy some constraints. The first nontrivial example of a conditional linear information inequality was proven in the seminal paper [6]; see a survey of other similar results in [24]. Until now, these inequalities looked like artifacts without practical or theoretical application.…”
Section: Introduction
confidence: 97%
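
A conditional inequality of this kind lends itself to a quick numerical sanity check. The sketch below is our own illustration, not code from any of the cited papers; it uses the Zhang–Yeung example stated after the abstract, builds a random joint distribution p(a, b, c, d) in which the premises I(A:B) = I(A:B|C) = 0 hold by construction, and compares the two sides of the conclusion:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf stored as a numpy array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def cmi(pmf, X, Y, Z=()):
    """I(X:Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z); X, Y, Z are
    tuples of axis indices into the joint pmf array."""
    def H(keep):
        drop = tuple(set(range(pmf.ndim)) - set(keep))
        return entropy(pmf.sum(axis=drop))
    return H(X + Z) + H(Y + Z) - H(X + Y + Z) - H(Z)

rng = np.random.default_rng(0)

# A, B, C are mutually independent binary variables, so the premises
# I(A:B) = I(A:B|C) = 0 hold by construction; D depends on (A, B, C).
pa = rng.dirichlet(np.ones(2))
pb = rng.dirichlet(np.ones(2))
pc = rng.dirichlet(np.ones(2))
pd_given_abc = rng.dirichlet(np.ones(2), size=(2, 2, 2))   # p(d | a, b, c)
pmf = (pa[:, None, None, None] * pb[None, :, None, None]
       * pc[None, None, :, None] * pd_given_abc)           # axes: a, b, c, d

A, B, C, D = (0,), (1,), (2,), (3,)
assert abs(cmi(pmf, A, B)) < 1e-9        # premise: I(A:B)   = 0
assert abs(cmi(pmf, A, B, C)) < 1e-9     # premise: I(A:B|C) = 0
lhs = cmi(pmf, C, D)
rhs = cmi(pmf, C, D, A) + cmi(pmf, C, D, B)
print(lhs <= rhs + 1e-9)                 # conclusion holds: True
```

The check passes for any seed, since the conclusion is a theorem whenever the premises hold; it says nothing about distributions that violate the premises.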
“…Proposition 5.6 (Double Markov Property [52]): For any X, Y, and Z with 0 = I(X ∧ Z | Y) = I(Y ∧ Z | X), the maximum common function U of X and Y satisfies I(X, Y ∧ Z | U) = 0.…”
Section: Data Processing Inequality
confidence: 99%
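
In plainer terms (our paraphrase, under the standard reading of the double Markov property): if Z is conditionally independent of X given Y and of Y given X, then Z depends on the pair (X, Y) only through their maximum common function U, i.e., the Gács–Körner common part, which is a function of X alone and also a function of Y alone.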