2015
DOI: 10.1016/j.physa.2015.03.018
A concavity property for the reciprocal of Fisher information and its consequences on Costa’s EPI

Abstract: We prove that the reciprocal of the Fisher information of a log-concave probability density X in R^n is concave in t with respect to the addition of a Gaussian noise Z_t = N(0, tI_n). As a byproduct of this result, we show that the third derivative of the entropy power of a log-concave probability density X in R^n is nonnegative in t with respect to the addition of a Gaussian noise Z_t. For log-concave densities this improves Costa's well-known concavity property of the entropy power [3].
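For reference, a compact restatement of the quantities the abstract refers to, using the standard definitions of differential entropy, Fisher information, and entropy power (these definitions are assumed here, not spelled out in the abstract; X_t denotes X + √t Z):

```latex
% Standard definitions (assumed; not stated in the abstract).
% X_t = X + \sqrt{t}\,Z, with Z \sim N(0, I_n) independent of X.
\[
  h(X) = -\int_{\mathbb{R}^n} f(x)\,\log f(x)\,dx, \qquad
  I(X) = \int_{\mathbb{R}^n} \frac{|\nabla f(x)|^2}{f(x)}\,dx, \qquad
  N(X) = \frac{1}{2\pi e}\, e^{\,2h(X)/n}.
\]
% De Bruijn's identity connects entropy and Fisher information along
% the heat semigroup:
\[
  \frac{d}{dt}\, h(X_t) = \tfrac12\, I(X_t).
\]
% Costa's EPI says t \mapsto N(X_t) is concave; the paper's two results,
% valid for log-concave X, are
\[
  \frac{d^2}{dt^2}\, \frac{1}{I(X_t)} \le 0,
  \qquad
  \frac{d^3}{dt^3}\, N(X_t) \ge 0.
\]
```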



Cited by 16 publications (15 citation statements). References 22 publications.
Citation types: 2 supporting, 13 mentioning, 0 contrasting.
“…Notice that the third derivative of the entropy power N(X + √t Z) was shown to be nonnegative under the log-concavity condition [5], and we recover this in Corollary 3. We also considered the fourth derivative, but failed to obtain the sign because we were unable to apply the Cauchy-Schwarz inequality as we did for the third derivative.…”
Section: On the Derivatives (supporting)
confidence: 63%
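As a minimal sanity check of the sign claims quoted above, the Gaussian case is the one where everything is in closed form: if X ~ N(0, s²I_n), then N(X_t) = s² + t is linear in t, so its third derivative vanishes, and 1/I(X_t) = (s² + t)/n is (weakly) concave. A short sympy sketch of this boundary case (it verifies consistency in the Gaussian case only, not the general log-concave theorem):

```python
import sympy as sp

# Gaussian boundary case (an assumption for this check): X ~ N(0, s^2 I_n),
# so X_t = X + sqrt(t) Z has law N(0, (s^2 + t) I_n) and its differential
# entropy is h(X_t) = (n/2) * log(2*pi*e*(s^2 + t)).
t, s, n = sp.symbols('t s n', positive=True)

h = n / 2 * sp.log(2 * sp.pi * sp.E * (s**2 + t))
N = sp.exp(2 * h / n) / (2 * sp.pi * sp.E)  # entropy power N(X_t)
I = n / (s**2 + t)                          # Fisher information I(X_t)

print(sp.simplify(N))       # -> s**2 + t: linear in t
print(sp.diff(N, t, 3))     # -> 0, consistent with d^3 N / dt^3 >= 0
print(sp.diff(1 / I, t, 2)) # -> 0, so 1/I is (weakly) concave in t
```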
“…As a corollary, we recover Toscani's result [5] on the third derivative of the entropy power using the Cauchy-Schwarz inequality, which is much simpler. In Section 3, we introduce the linear matrix inequality approach and transform the above two conjectures into feasibility checks of semidefinite programming problems.…”
Section: Introduction (supporting)
confidence: 51%
“…Log-concave random vectors and functions are important classes in many disciplines. In the context of information theory, several nice properties involving entropy of log-concave random vectors were recently established (see, e.g., [3,5,18,33,40,41]). Significant examples are Gaussian and exponential distributions as well as any uniform distribution on a convex set.…”
Section: Preliminaries (mentioning)
confidence: 99%