Shannon entropy convergence results in the countable infinite case
2012 IEEE International Symposium on Information Theory Proceedings
DOI: 10.1109/isit.2012.6283535

Cited by 11 publications (20 citation statements)
References 18 publications
“…Consequently, in finite alphabets the task of entropy estimation is simpler than estimating the distribution in terms of sampling complexity. These findings are consistent with the observation that the entropy is a continuous functional on the space of distributions (in the total variation distance sense) in the finite alphabet case [2, 23, 24, 25].…”
Section: Introduction (supporting)
confidence: 91%
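A quantitative form of this finite-alphabet continuity, added here as a hedged aside (it is not part of the quoted statement), is the sharpened Fannes-type inequality: for distributions p and q on an alphabet of size d, with T their total variation distance,

```latex
% Sharpened Fannes-type continuity bound (our addition): for p, q on an
% alphabet of size d and T = (1/2) ||p - q||_1 the total variation distance,
\[
  |H(p) - H(q)| \;\le\; T \log(d-1) \;+\; h_b(T),
  \qquad
  h_b(T) = -T \log T - (1-T) \log(1-T),
\]
% valid for 0 <= T <= 1 - 1/d. The log(d-1) factor blows up as d grows,
% which is consistent with entropy losing continuity on countably
% infinite alphabets, as the statements below point out.
```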
“…Moving to the regime of no gain in minimax redundancy, we assume that k_n ≥ u*_f(n) eventually with n. We adopt results from the seminal work of Haussler and Opper [11], which offers a lower bound on the mutual information and, consequently, on the channel capacity that corresponds to the information radius of a family of distributions. The proof of this lower bound uses a covering argument and the metric entropy of Λ_f with respect to the Hellinger distance, d_H : P(X) × P(X) → R_+.…”
Section: Regime Of No Gain In Minimax Redundancy (mentioning)
confidence: 99%
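For reference (our addition; normalization conventions for this metric vary across the literature), the Hellinger distance invoked above is commonly defined as

```latex
% Hellinger distance between p, q in P(X) (one common normalization):
\[
  d_H(p, q)
  \;=\;
  \Bigl( \tfrac{1}{2} \sum_{x \in \mathcal{X}}
         \bigl( \sqrt{p(x)} - \sqrt{q(x)} \bigr)^{2} \Bigr)^{1/2},
\]
% which takes values in [0, 1]. The metric entropy of a class Lambda_f
% is the logarithm of the minimal number of d_H-balls of radius epsilon
% needed to cover it, the quantity the covering argument works with.
```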
“…This last result is well known for finite alphabets; however, its extension to countably infinite alphabets is not straightforward due to the discontinuity of the entropy. The interested reader may refer to [17], [18] for further details.…”
mentioning
confidence: 99%
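A minimal numerical sketch of the standard counterexample behind this discontinuity (our illustration, not drawn from [17] or [18]): the sequence μ_n = (1 − 1/n)·δ_0 + (1/n)·Uniform{1, …, 2^n} converges to the point mass δ_0 in total variation, yet H(μ_n) → 1 bit while H(δ_0) = 0.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits; zero-mass symbols contribute nothing."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

for n in (2, 5, 10, 20):
    m = 2 ** n                                   # size of the uniform tail
    mu_n = np.concatenate(([1.0 - 1.0 / n],      # mass at the symbol 0
                           np.full(m, 1.0 / (n * m))))  # uniform tail
    tv = 1.0 / n                                 # TV distance to delta_0
    print(f"n = {n:2d}   TV(mu_n, delta_0) = {tv:.3f}   "
          f"H(mu_n) = {entropy_bits(mu_n):.3f} bits")

# TV -> 0 while H(mu_n) -> 1 bit and H(delta_0) = 0: the entropy is not
# continuous (in total variation) on a countably infinite alphabet.
```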
“…The problem of Shannon entropy estimation is revisited from the perspective of understanding the convergence properties of the entropy in the countably infinite case [1]. It is known that the entropy is not a continuous function in the countable alphabet case [2], [3], [4].…”
Section: Introduction (mentioning)
confidence: 99%
“…[17] on the limiting probability measure μ, for all sequences {μ_n : n ≥ 0} converging in reverse I-divergence to μ [5]. More recently, the work of Silva et al. [1] addresses entropy convergence by studying a number of new settings that involve conditions on the limiting measure μ as well as on the way the sequence {μ_n : n ≥ 0} converges to μ in the space of distributions. In this paper, we revisit some of these new results from the perspective of their applicability to the problem of entropy estimation [9], [10], [11], [4].…”
Section: Introduction (mentioning)
confidence: 99%
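For concreteness (our gloss; the direction convention below is how we read [5] and [1], not a quotation from them), convergence in reverse I-divergence places the limiting measure in the first argument of the Kullback-Leibler divergence:

```latex
% I-divergence (Kullback-Leibler divergence) between mu and nu:
\[
  D(\mu \,\|\, \nu) \;=\; \sum_{x \in \mathcal{X}}
      \mu(x) \log \frac{\mu(x)}{\nu(x)} .
\]
% Convergence in reverse I-divergence of {mu_n} to mu:
%     D(mu || mu_n) -> 0,
% as opposed to direct I-divergence convergence, D(mu_n || mu) -> 0.
% By Pinsker's inequality either mode implies convergence in total
% variation, but not conversely.
```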