2009 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2009.5205880

Concavity of entropy under thinning

Abstract: Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation $T_a$. That is, if $X$ and $Y$ are independent random variables on $\mathbb{Z}_+$ with ultra-log-concave probability mass functions, then $H(T_a X + T_{1-a} Y) \ge a\,H(X) + (1-a)\,H(Y)$, $0 \le a \le 1$, where $H$ denotes the discrete entropy. This is a discrete analogue of the inequality $h(\sqrt{a}\,X + \sqrt{1-a}\,Y) \ge a\,h(X) + (1-a)\,h(Y)$, $0 \le a \le 1$ (where $h$ denotes the differential entropy), which holds for …
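To make the quoted statement concrete, here is a minimal Python sketch, assuming the standard definition of Rényi's binomial thinning ($T_a X$ is the sum of $X$ independent Bernoulli($a$) marks). The truncation length, the Poisson and binomial test inputs, and all function names are illustrative choices, not part of the paper.

```python
# Minimal numerical sketch (not from the paper): Renyi's binomial thinning of a
# pmf on Z_+, and a spot check of the inequality from the abstract,
#   H(T_a X + T_{1-a} Y) >= a*H(X) + (1-a)*H(Y),
# for two ultra-log-concave inputs. Pmfs are truncated at N terms, so the
# check is approximate; N and the test distributions are arbitrary choices.
import numpy as np
from scipy.stats import binom, poisson

N = 200  # truncation length for pmfs on Z_+

def entropy(p):
    """Discrete (Shannon) entropy in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def thin(p, a):
    """Thinning T_a: replace X ~ p by the sum of X iid Bernoulli(a) marks."""
    q = np.zeros_like(p)
    for n, pn in enumerate(p):
        if pn > 0:
            q[:n + 1] += pn * binom.pmf(np.arange(n + 1), n, a)
    return q

def convolve(p, q):
    """Pmf of the sum of two independent variables, truncated to length N."""
    return np.convolve(p, q)[:N]

# Two ultra-log-concave examples: Poisson(3) and Binomial(10, 0.4).
px = poisson.pmf(np.arange(N), 3.0)
py = binom.pmf(np.arange(N), 10, 0.4)

for a in (0.25, 0.5, 0.75):
    lhs = entropy(convolve(thin(px, a), thin(py, 1 - a)))
    rhs = a * entropy(px) + (1 - a) * entropy(py)
    print(f"a = {a:.2f}: {lhs:.4f} >= {rhs:.4f}")
```

For ultra-log-concave inputs such as these, the printed left-hand side should dominate the right-hand side for every $a$, in line with the theorem.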


Cited by 15 publications (27 citation statements) | References 13 publications
“…Thinning was introduced by Rényi [27] as a discrete analogue of the rescaling of a continuous real random variable. It has played this role in discrete versions of the central limit theorem [28]-[30] and of the entropy power inequality [31], [32]. Most of these results require the ad hoc hypothesis of ultra-log-concavity (ULC) of the input state.…”
Section: Introduction (mentioning)
confidence: 99%
“…Since then, progress has been limited. In [23], Yu and Johnson considered the thinning operation of Rényi [15] and proved a result which implies concavity of entropy when each $p_i(t)$ is proportional to $t$ or $1-t$. Further, [10, Theorem 1.1] proved Theorem 1.2 in the case where each $p_i(t)$ is either constant or equal to $t$.…”
Section: Theorem 1.2 (Shepp-Olkin theorem) (mentioning)
confidence: 99%
“…Since then, progress has been limited. In [106, Theorem 2] it was proved that the entropy is concave when, for each $i$, either $p_i(0) = 0$ or $p_i(1) = 0$. (In fact, this follows from the case $n = 2$ of Theorem 6.3 above.)…”
Section: Entropy Concavity and the Shepp-Olkin Conjecture (mentioning)
confidence: 99%
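The two excerpts above concern the Shepp-Olkin setting, where the entropy of a sum of independent Bernoulli($p_i(t)$) variables is studied as a function of $t$. Below is a hedged numerical sketch that checks concavity along a path in which each $p_i(t)$ is proportional to $t$ or $1-t$, the case the quoted text attributes to [23]; the particular coefficients and grid are assumptions of ours, not taken from the cited works.

```python
# Hedged numerical sketch of the Shepp-Olkin setting in the excerpts above:
# the entropy of a sum of independent Bernoulli(p_i(t)) variables, checked for
# concavity in t along a path where each p_i(t) is proportional to t or 1-t.
# The coefficients and the grid are illustrative assumptions.
import numpy as np

def bernoulli_sum_pmf(ps):
    """Pmf of a sum of independent Bernoulli(p_i), built by repeated convolution."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def entropy(pmf):
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log(pmf))

# Each parameter path is proportional to t or to 1 - t.
paths = [lambda t: 0.8 * t, lambda t: 0.5 * (1 - t), lambda t: 0.6 * t]

ts = np.linspace(0.0, 1.0, 101)
H = np.array([entropy(bernoulli_sum_pmf([f(t) for f in paths])) for t in ts])

# Concavity on the grid: all second differences should be <= 0 (up to rounding).
print("max second difference:", np.diff(H, 2).max())
```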