2014
DOI: 10.1109/TIT.2014.2338852

Beyond the Entropy Power Inequality, via Rearrangements

Abstract: A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of rearrangements. For the special case of Boltzmann-Shannon entropy, this lower bound is better than that given by the entropy power inequality. Several applications are discussed, including a new proof of the classical entropy power inequality and an entropy inequality involving symmetrization of Lévy processes.
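
For reference, write N(X) = (2πe)^{−1} exp(2h(X)/d) for the entropy power of a random vector X in R^d with Shannon differential entropy h(X). The classical EPI states that

  N(X + Y) ≥ N(X) + N(Y)

for independent X and Y. Paraphrasing the abstract (with notation assumed here, not taken from the paper): if f* denotes the spherically symmetric decreasing rearrangement of a density f, the rearrangement-based lower bound takes the form

  h(X_1 + ... + X_n) ≥ h(X_1* + ... + X_n*),

where the X_i* are independent with densities f_i*.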

Cited by 67 publications (67 citation statements). References 43 publications.

“…Most of the existing results are on the second derivative of the differential entropy (or the mutual information), and on generalizing the EPI to other settings. For example: Guo et al. [11] represent the derivatives in the signal-to-noise ratio of the mutual information in terms of the minimum mean-square estimation error, building on de Bruijn's identity [2]; Wibisono and Jog [12] study the mutual information along the density flow defined by the heat equation and show that it is a convex function of time if the initial distribution is log-concave; Wang and Madiman [13] recover the proof of the EPI via rearrangements; Courtade [14] generalizes Costa's EPI to non-Gaussian additive perturbations; and König and Smith [15] propose a quantum version of the EPI.…”
Section: Introduction (mentioning; confidence: 99%)
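
The two identities this excerpt leans on are standard and worth recording (stated here in one dimension for context, not quoted from [2] or [11]). De Bruijn's identity: if Z is standard Gaussian, independent of X, then

  d/dt h(X + √t Z) = (1/2) J(X + √t Z),

where J denotes Fisher information. The I-MMSE relation of Guo, Shamai, and Verdú gives the other natural derivative:

  d/d(snr) I(X; √snr X + Z) = (1/2) mmse(X | √snr X + Z).

Differentiating entropy yields Fisher information; differentiating mutual information yields the MMSE.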
“…Remark 4.6. Recent work of Bobkov and Chistyakov [4] and of Wang and Madiman [22] has studied formulations of Shannon's Entropy Power Inequality (EPI) for the Rényi entropy of the convolution of probability densities f_i in R^d. To be specific, [4], Theorem 1, shows that for q > 1, the Rényi entropy power N_{R,q} satisfies…”
Section: For q > 1, if the Rényi entropy is concave then so is the T… (mentioning; confidence: 99%)
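
For orientation (definitions as standard in the literature, not quoted from [4] or [22]): for a probability density f on R^d and q > 1, the Rényi entropy and the associated Rényi entropy power are

  h_q(f) = (1/(1−q)) log ∫_{R^d} f(x)^q dx,   N_{R,q}(f) = exp(2 h_q(f) / d).

Results of the kind cited lower-bound N_{R,q}(f_1 * ... * f_n) by a q-dependent constant times the sum of the N_{R,q}(f_i); the exact constant is given in [4] and is not reproduced here.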
“…Since a strengthened form of the EPI (originally due to Costa [8]) proves that the entropy power is concave under convolution with a Gaussian, the Shepp-Olkin conjecture may relate to some form of a discrete EPI. It is interesting that [22], Section 7, proves its main theorem using only majorization theory; Shepp and Olkin's original paper [17] showed that the entropy of Bernoulli sums satisfies the (weaker) property of Schur concavity. Here (A.4) follows by the arithmetic mean-geometric mean inequality, (A.5) follows by the assumptions β² ≤ αγ and B² ≤ AC, and (A.6) uses the fact that, by assumption (iii), U(s) is decreasing in s.…”
Section: For q > 1, if the Rényi entropy is concave then so is the T… (mentioning; confidence: 99%)
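
The two concavity statements in play, as usually stated (paraphrased here, not quoted from [8] or [17]): Costa's EPI says that for X with a density on R^d and Z standard Gaussian independent of X, the map

  t ↦ N(X + √t Z)

is concave on [0, ∞), with N the entropy power defined above. The Shepp-Olkin conjecture concerns the discrete side: for independent B_i ~ Bernoulli(p_i), the Shannon entropy H(B_1 + ... + B_n) is conjectured to be a concave function of (p_1, ..., p_n) on [0,1]^n.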
“…Yet another proof based on properties of mutual information was proposed in [11], [12]. A more involved proof, based on a stronger form of the EPI that uses spherically symmetric rearrangements and is also related to Young's inequality with sharp constant, was recently given by Wang and Madiman [13].…”
Section: Introduction (mentioning; confidence: 99%)
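
The sharp form of Young's convolution inequality referenced here reads, in its standard formulation (Beckner; Brascamp and Lieb, stated for context): for 1/p + 1/q = 1 + 1/r with p, q, r ≥ 1 and f ∈ L^p(R^d), g ∈ L^q(R^d),

  ||f * g||_r ≤ (C_p C_q / C_r)^d ||f||_p ||g||_q,   where C_s² = s^{1/s} / s'^{1/s'} and 1/s + 1/s' = 1.

Letting p, q, r → 1 at matched rates recovers the EPI; this limiting argument goes back to Lieb and underlies the rearrangement-based proofs cited above.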
“…(10)], [12] and [19, Eq. (25)]; it is interesting to note that in this context, Fisher information and MMSE are complementary quantities;
• proofs in [4], [7], [13], [20] are related to Young's inequality with sharp constant, or to an equivalent argument using spherically symmetric rearrangements, and/or to the convergence of Rényi entropies.
It should also be noted that not all of the available proofs of (2) settle the equality case: that equality in (2) holds only for Gaussian random vectors with identical covariances.…”
Section: Introduction (mentioning; confidence: 99%)
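
The complementarity mentioned in the first bullet can be made precise by a standard identity along the Gaussian channel (one-dimensional form, recorded here for context rather than quoted from the cited works): if Z is standard Gaussian, independent of X, then for t > 0

  mmse(X | X + √t Z) = t − t² J(X + √t Z),

so the MMSE determines the Fisher information of the noisy observation and vice versa; de Bruijn's identity and the I-MMSE formula recorded earlier are then two expressions of the same derivative.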