1995
DOI: 10.1103/physreve.52.6841

Estimating functions of probability distributions from a finite set of samples

Abstract: This paper is the first of two on the problem of estimating a function of a probability distribution from a finite set of samples of that distribution. In this paper a Bayesian analysis of this problem is presented, the optimal properties of the Bayes estimators are discussed, and as an example of the formalism, closed form expressions for the Bayes estimators for the moments of the Shannon entropy function are derived. Numerical results are presented that compare the Bayes estimator to the frequency-counts estimator…

Cited by 157 publications (208 citation statements)
References 15 publications
“…A better strategy seems to consist in using Eq. 36. But notice that also this can lead to systematic errors in either direction, if bad prior estimates are used.…”
Section: (8)
mentioning
confidence: 99%
“…5). From these distributions, entropy was calculated via seven different estimation techniques that vary in degree of bias, standard deviation and computational complexity: the classical direct technique (Shannon & Weaver, 1949), Ma lower bound (Ma, 1981), best upper bound (Paninski, 2003), Treves-Panzeri-Miller-Carlton (Treves & Panzeri, 1995; Miller, 1955; Carlton, 1969), Jackknife (Efron & Tibshirani, 1993), Wolpert-Wolf (Wolpert & Wolf, 1994; Wolpert & Wolf, 1995) and Chao-Shen (Chao & Shen, 2003). As an example, with the classic estimation technique we found the direct entropy estimate H_Dir as:…”
Section: Entropy Estimation
mentioning
confidence: 99%
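The classical direct (plug-in) technique named in the excerpt above simply substitutes observed frequencies for the unknown probabilities. A minimal sketch, with an illustrative function name and interface not taken from the cited work:

```python
import math
from collections import Counter

def direct_entropy(samples, base=2.0):
    """Plug-in (direct) Shannon entropy estimate from a finite sample.

    Replaces each unknown probability p_i with the observed frequency
    n_i / N. For small N this estimate is biased low, which is the bias
    the Bayesian and bias-corrected estimators listed above address.
    """
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())
```

For example, a sample split evenly between two symbols gives 1 bit, while a constant sample gives 0 bits.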
“…Knuth [22] proposed a Bayesian approach, implemented in Matlab and Python and known as the Knuth method, to estimate the probability distributions using a piecewise constant model incorporating the optimal bin-width estimated from data. Wolpert and Wolf [23] provided a successful Bayesian approach to estimate the mean and the variance of entropy from data. Nemenman et al [24] utilized a mixture of Dirichlet distributions-based prior in their Bayesian Nemenman, Shafee, and Bialek (NSB) entropy estimator.…”
Section: Introduction
mentioning
confidence: 99%
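The Bayesian approach referenced in this excerpt yields closed-form posterior moments of the entropy. A minimal sketch of the posterior-mean entropy under a uniform Dirichlet(1, …, 1) prior, using the standard closed form for the expected entropy of a Dirichlet distribution (the function names are illustrative; the Wolpert-Wolf results cover general moments, not only the mean):

```python
import math

_EULER = 0.5772156649015329  # Euler-Mascheroni constant

def _digamma_int(m):
    """Digamma at a positive integer m: psi(m) = -gamma + H_{m-1}."""
    return -_EULER + sum(1.0 / k for k in range(1, m))

def bayes_entropy_mean(counts, K=None):
    """Posterior-mean Shannon entropy (in nats) given integer bin counts,
    under a uniform Dirichlet(1, ..., 1) prior over K bins.

    The posterior is Dirichlet with parameters a_i = n_i + 1, and the
    mean entropy of a Dirichlet(a_1, ..., a_K), A = sum(a_i), is
        E[H] = psi(A + 1) - sum_i (a_i / A) * psi(a_i + 1).
    """
    K = K if K is not None else len(counts)
    a = [n + 1 for n in counts] + [1] * (K - len(counts))
    A = sum(a)  # = N + K for the uniform prior
    return _digamma_int(A + 1) - sum(ai * _digamma_int(ai + 1) for ai in a) / A
```

With no data at all (all counts zero, K = 2) this returns the prior mean 1/2 nat, and with many samples it approaches the plug-in estimate, illustrating how the prior's influence fades as N grows.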