New Entropy Estimators with Smaller Root Mean Squared Error

2015 · DOI: 10.22237/jmasm/1446350940

Cited by 5 publications (1 citation statement) · References 14 publications
“…As mentioned before, it is well known that there is no unbiased estimator of entropy [ 4 ], let alone one of minimum variance, while the convergence rate of a consistent estimator can be arbitrarily slow [ 5 ]. Two well-known statistical tools from estimation theory will be used to evaluate the selected estimators experimentally: bias and mean squared error [ 48 ]. The bias of an estimator $\hat{H}$ measures the deviation of the estimate from the true value and is calculated as the difference $B(\hat{H}) = E[\hat{H}] - H$, where $E[\hat{H}]$ is the expected value of $\hat{H}$. In our scenario we work with uniformly distributed samples of bytes and bits, so the data follow a uniform distribution and we have $H = \log_2 N$ (8 bits for byte samples, 1 bit for bit samples).…”
Section: Preliminaries
confidence: 99%
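To illustrate the evaluation procedure the citing authors describe, the sketch below estimates the bias and root mean squared error of an entropy estimator on uniform byte samples by Monte Carlo simulation. It uses the plug-in (maximum-likelihood) estimator as a stand-in; that choice, the sample sizes, and the function names are assumptions for illustration only, and the estimators proposed in the paper are not reproduced here.

```python
import numpy as np

def plug_in_entropy(samples, base=2):
    """Plug-in (maximum-likelihood) entropy estimate, in bits by default."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(base)

def bias_and_rmse(n_samples=4096, n_trials=500, true_h=8.0, seed=0):
    """Monte Carlo bias and RMSE of the estimator on uniform bytes.

    For bytes drawn uniformly from {0, ..., 255}, the true entropy
    is H = log2(256) = 8 bits, matching the uniform-data setting
    described in the citation statement above.
    """
    rng = np.random.default_rng(seed)
    estimates = np.array([
        plug_in_entropy(rng.integers(0, 256, size=n_samples))
        for _ in range(n_trials)
    ])
    bias = estimates.mean() - true_h            # B(H_hat) = E[H_hat] - H
    rmse = np.sqrt(np.mean((estimates - true_h) ** 2))
    return bias, rmse

if __name__ == "__main__":
    bias, rmse = bias_and_rmse()
    print(f"bias = {bias:+.4f} bits, RMSE = {rmse:.4f} bits")
```

Consistent with the cited impossibility result, the plug-in estimator's bias is systematically negative (it underestimates the true entropy), which is exactly the kind of error that bias and RMSE comparisons between estimators are meant to quantify.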