2011
DOI: 10.1109/tit.2011.2111010

Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error

Abstract: Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analy…
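The concavity claim in the abstract can be probed numerically. The sketch below is an illustration of my own (the discrete-input model, the helper `mmse_discrete`, and the Monte Carlo setup are not from the paper): it estimates the MMSE of a discrete input in Gaussian noise using the exact conditional mean, then checks that a mixture of two input distributions has an MMSE at least as large as the corresponding mixture of the individual MMSEs.

```python
import numpy as np

rng = np.random.default_rng(0)

def mmse_discrete(support, probs, snr, n=400_000):
    # Monte Carlo estimate of the MMSE of a discrete input X observed
    # as Y = sqrt(snr)*X + N, with N ~ N(0, 1), using the exact
    # conditional mean E[X|Y] for a discrete prior.
    support = np.asarray(support, dtype=float)
    probs = np.asarray(probs, dtype=float)
    x = rng.choice(support, size=n, p=probs)
    y = np.sqrt(snr) * x + rng.standard_normal(n)
    # Posterior weights are proportional to p_x * phi(y - sqrt(snr)*x).
    lik = probs * np.exp(-0.5 * (y[:, None] - np.sqrt(snr) * support) ** 2)
    xhat = (lik * support).sum(axis=1) / lik.sum(axis=1)
    return np.mean((x - xhat) ** 2)

snr, lam = 1.0, 0.5
m0 = mmse_discrete([-1.0, 1.0], [0.5, 0.5], snr)     # P0: X = ±1 equiprobable
m1 = 0.0                                             # P1: point mass at 0
m_mix = mmse_discrete([-1.0, 0.0, 1.0],
                      [lam / 2, 1 - lam, lam / 2], snr)
# Concavity of the MMSE in the input distribution means
# mmse(lam*P0 + (1-lam)*P1) >= lam*mmse(P0) + (1-lam)*mmse(P1).
```

Here `P1` is a point mass (whose MMSE is zero), so the concavity inequality reduces to `m_mix >= lam * m0`, which the Monte Carlo estimates confirm up to sampling error.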


Cited by 211 publications (12 citation statements)
References 37 publications
“…Each inequality in Lemma 2 is satisfied with equality if and only if the input X is Gaussian. That is, for Gaussian inputs, Var(X|Y ) is constant almost surely, and mmse and differential entropy are maximized [21]. From (29), it is evident that the maximum differential entropy of the conditional mean is achieved when the input X is Gaussian even though it minimizes the variance of E[X|Y ], which we record in the following corollary:…”
Section: A. Upper Bounds
confidence: 75%
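The Gaussian-input facts quoted above are easy to verify numerically. In the sketch below (my own illustration, not code from the cited works), for a unit-variance Gaussian input the conditional mean is linear in Y and the conditional variance is the constant 1/(1+snr), so the empirical MMSE matches 1/(1+snr); a non-Gaussian unit-variance input such as X = ±1 yields a strictly smaller MMSE, consistent with Gaussian inputs maximizing the MMSE.

```python
import numpy as np

rng = np.random.default_rng(1)
snr, n = 2.0, 500_000

# Gaussian input: X ~ N(0, 1), Y = sqrt(snr)*X + N(0, 1).
x = rng.standard_normal(n)
y = np.sqrt(snr) * x + rng.standard_normal(n)
# E[X|Y] is linear and Var(X|Y) = 1/(1+snr) almost surely (a constant).
xhat = np.sqrt(snr) / (1.0 + snr) * y
gauss_mmse = np.mean((x - xhat) ** 2)     # ≈ 1/(1+snr)

# BPSK input of the same variance: X = ±1, E[X|Y] = tanh(sqrt(snr)*Y).
xb = rng.choice([-1.0, 1.0], size=n)
yb = np.sqrt(snr) * xb + rng.standard_normal(n)
bpsk_mmse = np.mean((xb - np.tanh(np.sqrt(snr) * yb)) ** 2)
```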
“…1. Properties of MMSE such as monotonicity, convexity, and infinite differentiability as a function of snr have been shown in [21], while its functional properties as a function of input-output distribution have been analyzed in [22], [23]. Recently, in [24], [25], the authors have focused on the derivatives of the conditional mean with respect to the observation, and many previously known identities in the literature have been recovered.…”
Section: B. Related Work
confidence: 99%
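The monotonicity of the MMSE in snr can be checked deterministically for a concrete input. The sketch below (my own illustration; the BPSK choice and the quadrature setup are assumptions, not from the cited works) evaluates the BPSK MMSE curve by Gauss–Hermite quadrature, using E[X|Y=y] = tanh(sqrt(snr)·y), and confirms it is strictly decreasing and dominated by the Gaussian-input curve 1/(1+snr).

```python
import numpy as np

# Gauss-Hermite nodes/weights: E_{Z~N(0,1)}[f(Z)] ≈ sum(w*f(sqrt(2)*t))/sqrt(pi)
t, w = np.polynomial.hermite.hermgauss(80)

def mmse_bpsk(snr):
    # For X = ±1 equiprobable and Y = sqrt(snr)*X + N(0, 1),
    # mmse(snr) = 1 - E[tanh^2(sqrt(snr)*Z + snr)], with Z ~ N(0, 1).
    z = np.sqrt(2.0) * t
    e = np.sum(w * np.tanh(np.sqrt(snr) * z + snr) ** 2) / np.sqrt(np.pi)
    return 1.0 - e

snrs = np.linspace(0.1, 8.0, 40)
m = np.array([mmse_bpsk(s) for s in snrs])
# m is strictly decreasing in snr and lies below 1/(1+snr) pointwise.
```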
“…Existing approaches for noise variance estimation equate the mean value in (60) with the available sample r_{m,n}^{MS} and provide the following noise variance estimate [51,52]:…”
Section: Noise Variance Estimation When θ_{M*} ∈ H_M
confidence: 99%
“…where Equation (25a) holds by substituting t_i with t̃_i in Equation (20), Equation (25b) is due to the optimality of t_i at the next iteration, and Equation (25c) is due to the concavity of T_{k,i} w.r.t. t_i [46]. Hence, according to [47], the proposed SCA algorithm will converge to a point leading to a locally optimal solution of Equation (17), which satisfies KKT conditions of the original Equation (17).…”
Section: Update of p_{k,i} With Given σ²_{A,i}
confidence: 99%