2009 IEEE International Workshop on Machine Learning for Signal Processing
DOI: 10.1109/mlsp.2009.5306242
On the statistical estimation of Rényi entropies

Cited by 6 publications (11 citation statements); references 13 publications.
“…Several estimators of entropy measures have been proposed for general multivariate densities f. These include consistent estimators based on entropic graphs [19,36], gap estimators [43], nearest neighbor distances [17,26,29,45], kernel density plug-in estimators [1,11,3,18,4,16], Edgeworth approximations [21], convex risk minimization [35] and orthogonal projections [25].…”
Section: Introduction (mentioning; confidence: 99%)
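To make the nearest-neighbor family above concrete, here is a minimal sketch of a k-NN Rényi entropy estimator in the style of Leonenko, Pronzato and Savani. The function name, the default k = 4, and the use of scipy.spatial.cKDTree are illustrative assumptions, not details taken from the cited papers.

```python
# Hedged sketch of a k-nearest-neighbor Renyi entropy estimator
# (Leonenko-Pronzato-Savani style). Names and defaults are assumptions.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def renyi_entropy_knn(x, alpha, k=4):
    """Estimate H_alpha (alpha != 1) from samples x, an (n, d) array."""
    n, d = x.shape
    tree = cKDTree(x)
    # k + 1 because each point's nearest "neighbor" is the point itself;
    # rho holds the distance from each sample to its k-th true neighbor.
    rho = tree.query(x, k=k + 1)[0][:, k]
    # log volume of the unit d-ball: V_d = pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # bias constant C_k = [Gamma(k) / Gamma(k + 1 - alpha)]^(1/(1 - alpha));
    # well defined only for k > alpha - 1.
    log_ck = (gammaln(k) - gammaln(k + 1 - alpha)) / (1 - alpha)
    # zeta_i = (n - 1) * C_k * V_d * rho_i^d;  I_alpha ~ mean(zeta^(1 - alpha))
    log_zeta = np.log(n - 1) + log_ck + log_vd + d * np.log(rho)
    i_alpha = np.mean(np.exp((1 - alpha) * log_zeta))
    return np.log(i_alpha) / (1 - alpha)
```

As alpha approaches 1 the estimate approaches the Shannon entropy; duplicate sample points (rho = 0) break the logarithm and should be deduplicated or jittered beforehand.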
“…Note that (40) provides us with an estimate of the entropy based upon the k-th most similar patch and, thus, varies with the choice of k. Estimators of the Shannon (and Rényi) entropy based upon a combination of such estimates obtained using multiple values of k have been proposed in [40] and [41]. However, such estimators require computation of distances to the k most similar patches for each patch in the cluster, a process that can be quite time consuming.…”
Section: Appendix A: Entropy Estimation (mentioning; confidence: 99%)
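The combination idea described above can be sketched by averaging the per-k estimates, reusing renyi_entropy_knn from the previous sketch. The unweighted mean and the particular grid of k values are assumptions; [40] and [41] may weight or otherwise combine the terms differently.

```python
def renyi_entropy_multi_k(x, alpha, ks=(3, 4, 5, 6)):
    # Hedged sketch: unweighted mean of per-k estimates. The weighting
    # scheme (here: none) is an assumption, not taken from [40] or [41].
    return float(np.mean([renyi_entropy_knn(x, alpha, k) for k in ks]))
```

On the cost point raised in the excerpt: a single tree query with k = max(ks) already retrieves all the needed neighbor distances, so combining several k values need not cost much more than a single-k estimate.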
“…The gradients are derived with respect to the model parameters specifically for the Gaussian channel model of the form y = Ax + n (41), where x and y are the input and output of the Gaussian channel, respectively, A is a deterministic matrix, and n is i.i.d. Gaussian noise. The authors in [26] show that achieves a slightly better estimate of the entropy than using N = 4.…”
Section: Appendix B: Relation Between MI and MMSE Matrix (mentioning; confidence: 99%)
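As a sanity check on the channel model reconstructed above, the sketch below simulates y = Ax + n and compares the k-NN estimate against the closed-form differential entropy of the Gaussian output. The dimensions, the noise level, and the use of alpha = 0.99 as a near-Shannon proxy are all assumptions; renyi_entropy_knn comes from the first sketch.

```python
rng = np.random.default_rng(0)
d, n = 3, 20000
A = rng.normal(size=(d, d))              # deterministic channel matrix
x = rng.normal(size=(n, d))              # Gaussian channel input
noise = 0.5 * rng.normal(size=(n, d))    # iid Gaussian noise, std 0.5
y = x @ A.T + noise                      # channel output y = A x + n

# Gaussian output: h(y) = 0.5 * log((2 pi e)^d * det(cov_y)),
# with cov_y = A A^T + 0.25 * I for the noise variance above.
cov_y = A @ A.T + 0.25 * np.eye(d)
h_true = 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov_y)[1])
h_est = renyi_entropy_knn(y, alpha=0.99, k=4)
print(h_true, h_est)                     # the two should be close at this n
```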
“…Because the probability P(X_1 ∈ S_2) turns out to admit a convenient asymptotic expression, it is possible to use Equation (8) to estimate the quantity…”
Section: Now If We Take (mentioning; confidence: 99%)
“…The expectation (1) is also of interest in its own right and tends to appear in various scientific contexts. A significant application is found in the nonparametric estimation of Rényi entropies, where asymptotic analysis provides theoretically sound estimators [5,6,8]. Moreover, nearest neighbor distances and distributions play a major role in the understanding of nonparametric estimation in general [1,4].…”
Section: Introduction (mentioning; confidence: 99%)
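Since the excerpt does not reproduce the expectation (1) itself, the following is only a hedged Monte Carlo sketch of the kind of nearest-neighbor-distance moment such analyses study, E[rho^gamma]. The uniform sampling density, gamma = 1, and k = 1 are assumptions; cKDTree is imported in the first sketch.

```python
rng = np.random.default_rng(1)
d, n, k, gamma = 2, 5000, 1, 1.0
pts = rng.random((n, d))                          # uniform on the unit square
rho = cKDTree(pts).query(pts, k=k + 1)[0][:, k]   # 1st-NN distances (self excluded)
print(np.mean(rho ** gamma))                      # empirical estimate of E[rho^gamma]
```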