2018
DOI: 10.3390/e20060397

Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators

Abstract: This work addresses the problem of Shannon entropy estimation in countably infinite alphabets by studying and adopting recent convergence results for the entropy functional, which is known to be a discontinuous function in the space of probabilities on ∞-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with some deviation inequalities (including scenarios with both finitely and infinitely supported assumptions on the target distribution). From this perspective, four …
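To make the object of study concrete, here is a minimal sketch of the classical plug-in estimator in Python (the function name and setup are ours, not taken from the paper): it simply evaluates the entropy functional at the empirical distribution of the sample.

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (empirical) Shannon entropy in nats: evaluate the
    entropy functional at the empirical distribution of the sample."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))
```

On an infinite alphabet a finite sample only ever exposes finitely many symbols, which is precisely why the convergence of this estimator is delicate and why the paper leans on convergence results for the entropy functional itself.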

Cited by 6 publications (6 citation statements)
References 53 publications
“…On the constructive side, constraining the problem to a family of distributions with specific power tail-bounded conditions, Antos et al. [36, Theorem 7] presented a finite-length expression for the rate of convergence of the estimation error of the classical plug-in estimator. Similar results (distribution-free strong consistency in H(X) and an almost-sure rate of convergence of the estimation error under some tail-bounded conditions) have been obtained for a data-driven partition scheme in [37].…”
Section: Entropy and Distribution Estimation for Infinite Alphabets: ... (supporting)
confidence: 74%
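As a purely illustrative companion to this kind of rate result (the simulation is ours; it is not the construction of [36] or [37]), one can watch the plug-in error shrink for a source with a power tail, here a Zipf law:

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy(samples):
    # Entropy (nats) of the empirical distribution of the sample.
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

# Zipf(a=2.5) has tail p(k) proportional to k^(-2.5), i.e., a
# power tail-bounded distribution of the kind treated in [36].
for n in (10**3, 10**4, 10**5, 10**6):
    print(n, plugin_entropy(rng.zipf(2.5, size=n)))
```

The printed estimates increase toward the true entropy as n grows, reflecting the well-known downward bias of the plug-in estimator at finite sample size.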
“…As a side comment, the tail-bounded conditions used in [36], [37] to obtain rates of convergence for entropy estimation are stronger than the condition used in our work to define the envelope families (Definition 4), used in our achievability results in Theorems 2 and 3, where $\pi_n(A^n_j) \equiv \{ B \in \pi_n : A^n_j \cap B \neq \emptyset \}$. At this point, we can show that $C(A^n_j) < \infty$, $\forall j \in J$.…”
Section: Proofs of the Main Results of Section IV (mentioning)
confidence: 89%
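For orientation, the two conditions being contrasted have roughly the following shapes (our schematic paraphrase; the exact form of Definition 4 in the citing work may differ):

```latex
\text{power tail bound:}\quad p(k) \le C\,k^{-\alpha}\ \ (\alpha > 1),
\qquad
\text{envelope family:}\quad \Lambda_f = \{\, p : p(k) \le f(k)\ \forall k \,\},\quad \textstyle\sum_k f(k) < \infty .
```

Any power tail bound with α > 1 yields a summable envelope f(k) = C k^{-α}, while a summable envelope need not decay polynomially, which is the sense in which the tail condition is the stronger one.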
“…We thus only prove the former in Section 2.3 and omit the proof of the latter. We remark that Theorem 1.3 implies that $H^{(lw)}_k(n)$ converges, as $n \to \infty$, to the entropy of the limiting distribution of the sequence $I_k(n)$ (see, for instance, Proposition 1 in [37]; some care is required because, in general, the entropy is not a continuous function of distributions in infinite spaces, even in a discrete setting [11,37]).…”
Section: Introduction and Statement of Results (mentioning)
confidence: 99%
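The discontinuity mentioned here has a standard short witness (a classical construction, not specific to [11] or [37]): total-variation convergence to a point mass does not force the entropies to converge.

```latex
\mu_n \;=\; \Bigl(1-\tfrac{1}{\ln n}\Bigr)\delta_0 \;+\; \tfrac{1}{n\ln n}\sum_{k=1}^{n}\delta_k,
\qquad
\|\mu_n-\delta_0\|_{TV} = \tfrac{1}{\ln n}\;\to\;0,
\qquad
H(\mu_n)\;\to\;1 \;\neq\; 0 = H(\delta_0).
```

Here $H(\mu_n) = -(1-\varepsilon_n)\ln(1-\varepsilon_n) + \varepsilon_n \ln(n/\varepsilon_n)$ with $\varepsilon_n = 1/\ln n$, so $\varepsilon_n \ln n = 1$ while the remaining terms vanish.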
“…Even though $H(\cdot)$ is not continuous on $\Delta_{\mathbb{N}}$, the plug-in estimate $H(\hat{\mu}_n)$ converges to $H(\mu)$ almost surely and in $L^2$ [Antos and Kontoyiannis, 2001a]. Silva [2018] studied a variety of restrictions on distributions over infinite alphabets to derive strong consistency results and rates of convergence.…”
Section: Related Work (mentioning)
confidence: 99%
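A quick Monte Carlo sanity check of this almost-sure and $L^2$ convergence statement (the setup is ours: a geometric source on {0, 1, ...} with parameter q, whose entropy has the closed form H = h(q)/q with h the binary entropy in nats):

```python
import numpy as np

rng = np.random.default_rng(1)

def plugin_entropy(samples):
    # Entropy (nats) of the empirical distribution of the sample.
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

q = 0.3
# Geometric on {0,1,2,...}: H = (-q ln q - (1-q) ln(1-q)) / q nats.
H_true = (-q * np.log(q) - (1 - q) * np.log(1 - q)) / q

for n in (10**2, 10**3, 10**4, 10**5):
    sq_errs = [(plugin_entropy(rng.geometric(q, size=n) - 1) - H_true) ** 2
               for _ in range(50)]
    print(n, np.mean(sq_errs))  # empirical L2 error decays with n
```

The averaged squared error decays with n, consistent with the $L^2$ convergence shown by Antos and Kontoyiannis.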