1976
DOI: 10.1109/tit.1976.1055550

A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)

Cited by 163 publications (96 citation statements); references 4 publications.
“…Non-parametric estimation of Shannon entropy has been of interest to many in non-parametric statistics, pattern recognition, model identification, image registration and other areas [11], [12], [13], [14], [15], [1], [16]. Estimation of α-entropy arises as a step towards Shannon entropy estimation, e.g., Mokkadem [17] constructed a nonparametric estimate of the Shannon entropy from a convergent sequence of α-entropy estimates.…”
Section: Introduction
confidence: 99%
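The α-entropy mentioned above can be read as the Rényi entropy of order α, which converges to the Shannon entropy as α → 1; this is what makes a convergent sequence of α-entropy estimates usable as a route to Shannon entropy estimation. A minimal sketch of that limit, using the closed form for a Gaussian N(0, σ²) (the choice of distribution and function name are illustrative, not from the paper):

```python
import math

def renyi_entropy_gaussian(alpha, sigma=1.0):
    """Renyi entropy of order alpha for N(0, sigma^2):
    H_alpha = ln(sigma * sqrt(2*pi)) - ln(alpha) / (2 * (1 - alpha)),
    obtained from H_alpha = (1/(1-alpha)) * ln(integral of p^alpha)."""
    return math.log(sigma * math.sqrt(2 * math.pi)) - math.log(alpha) / (2 * (1 - alpha))

# Shannon (differential) entropy of N(0, 1): 0.5 * ln(2*pi*e) ≈ 1.4189
shannon = 0.5 * math.log(2 * math.pi * math.e)

# As alpha -> 1, the Renyi entropies approach the Shannon entropy from below.
approximations = {a: renyi_entropy_gaussian(a) for a in (2.0, 1.1, 1.01, 1.001)}
```

The Rényi entropy is non-increasing in α, so the sequence approaches the Shannon value monotonically from below as α decreases toward 1.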
“…Certainly, mutual information-based methods similar to what we have outlined in this paper could be developed to test locality assumptions in spatially distributed data assimilation applications. There are also methods for calculating joint entropies using kernel approximations of the underlying probability density functions which do not require reprojections [e.g., Ahmad and Lin, 1976]; however, these methods are ultimately subject to the same curse of dimensionality which limits all nonparametric density estimation, and are only practical up to a few dimensions. In both the Theory and Demonstration sections (sections 2 and 3), we estimated the entropy of one- and two-dimensional probability density functions and considered only one observation at a time.…”
Section: Summary and Discussion
confidence: 99%
“…Assuming we have estimates p̂1, p̂2 of the pdfs p1, p2, we use the Ahmad-Lin entropy estimators [28]:…”
Section: Non-parametric Estimation of the Similarity Measure
confidence: 99%
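The Ahmad-Lin estimator referenced in these statements is the resubstitution estimate: plug a nonparametric density estimate p̂ back into the entropy functional and average over the sample, Ĥ = -(1/n) Σᵢ log p̂(Xᵢ). A minimal one-dimensional sketch using a Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth (the kernel and bandwidth choices here are assumptions; the paper leaves them to the user):

```python
import numpy as np

def ahmad_lin_entropy(x, bandwidth=None):
    """Resubstitution entropy estimate H_n = -(1/n) * sum_i log p_hat(x_i),
    where p_hat is a Gaussian kernel density estimate evaluated at the
    same sample points used to build it."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule of thumb (an illustrative choice, not prescribed
        # by the 1976 paper).
        bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)
    # p_hat(x_i) = (1/(n*h)) * sum_j K((x_i - x_j) / h), Gaussian kernel K.
    diffs = (x[:, None] - x[None, :]) / bandwidth
    kde_at_samples = np.exp(-0.5 * diffs**2).sum(axis=1) / (
        n * bandwidth * np.sqrt(2 * np.pi)
    )
    return -np.log(kde_at_samples).mean()

rng = np.random.default_rng(0)
sample = rng.standard_normal(2000)
h_hat = ahmad_lin_entropy(sample)
h_true = 0.5 * np.log(2 * np.pi * np.e)  # true entropy of N(0, 1), ~1.4189
```

Because the kernel estimate smooths the density and each point contributes to its own density value, the estimate carries a small bias, and the pairwise evaluation above is O(n²); both points echo the curse-of-dimensionality caveat raised in the citation statements.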