2008 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2008.4595471

Thinning and information projections

Abstract: In this paper we establish lower bounds on the information divergence of a distribution on the integers from a Poisson distribution. These lower bounds are tight, and in the cases where a rate of convergence in the Law of Thin Numbers can be computed, the rate is determined by the lower bounds proved in this paper. General techniques for obtaining lower bounds in terms of moments are developed. The results about lower bounds in the Law of Thin Numbers are used to derive similar results for the Central Limit Theorem.
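For orientation, the objects in the abstract can be written out as follows. This is the standard formulation from the thinning literature, with notation assumed rather than quoted from the paper: the thinning T_α of a distribution P on the non-negative integers, and the Law of Thin Numbers for P with mean λ.

```latex
% Binomial thinning of P: each of the j units of an outcome j
% survives independently with probability alpha.
(T_\alpha P)(k) \;=\; \sum_{j \ge k} P(j)\,\binom{j}{k}\,\alpha^{k}(1-\alpha)^{j-k},
\qquad k = 0, 1, 2, \dots

% Law of Thin Numbers: P^{*n} denotes n-fold convolution,
% D denotes information divergence, Po(lambda) the Poisson law.
D\!\left( (T_{1/n} P)^{*n} \,\middle\|\, \mathrm{Po}(\lambda) \right)
\;\longrightarrow\; 0
\quad \text{as } n \to \infty .
```

The paper's contribution, per the abstract, is matching lower bounds on this divergence in terms of moments of P.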

Cited by 7 publications (21 citation statements, 2009–2019) · References 19 publications
“…Letting M have a Poisson distribution in (1) yields the special case of the compound Poisson, which plays an important role in limit theorems and approximation bounds for discrete random variables; see, for example, [2], [3]. Recently, Kontoyiannis and Madiman [18], Madiman et al [20], and Johnson et al [13] have explored compound Poisson approximation and limit theorems using information theoretic ideas, extending the results of [17] and [12] for the Poisson (see also [8], [9], [32]). As a first step toward a compound Poisson limit theorem with the same appealing "entropy increasing to the maximum" interpretation as the central limit theorem ([4], [1], [19], [27]), we need to identify a suitable class of distributions among which the compound Poisson has maximum entropy ([13]).…”
Section: Introduction and Main Results (mentioning)
confidence: 99%
“…that Pr(X_i = 0) is close to one, Pr(X_i = 1) is uniformly small, and Pr(X_i > 1) is negligible compared to Pr(X_i = 1); and ii) the dependence between the X_i's is sufficiently weak. In the version considered by Harremoës et al [15], [16] and in this paper, the X_i's are i.i.d. random variables obtained from a common distribution through thinning.…”
Section: Introduction (mentioning)
confidence: 85%
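As a concrete illustration of the thinning construction in this quote, here is a minimal numerical sketch (not code from the paper; the geometric starting distribution, the truncation length, and all function names are choices made for this example, and SciPy is assumed available). It thins a distribution with mean λ by 1/n, convolves n i.i.d. copies, and watches the divergence from Po(λ) shrink, which is the Law of Thin Numbers in action:

```python
import numpy as np
from math import comb
from scipy.stats import poisson

def thin(p, alpha):
    """Binomial thinning: (T_alpha p)(k) = sum_j p(j) C(j,k) alpha^k (1-alpha)^(j-k)."""
    q = np.zeros_like(p)
    for j, pj in enumerate(p):
        for k in range(j + 1):
            q[k] += pj * comb(j, k) * alpha**k * (1 - alpha)**(j - k)
    return q

def kl(p, q):
    """Information divergence D(p || q) in nats, over the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Geometric(1/2) on {0, 1, ...}, truncated to 40 terms; its mean is (about) 1.
support = 40
p = np.array([0.5 ** (k + 1) for k in range(support)])
lam = float(np.sum(np.arange(support) * p))

ks = np.arange(support)
for n in [1, 2, 4, 8, 16]:
    t = thin(p, 1.0 / n)            # thin by 1/n ...
    s = t
    for _ in range(n - 1):          # ... then take the n-fold convolution
        s = np.convolve(s, t)[:support]
    # Divergence from Po(lambda) decreases as n grows (up to truncation error).
    print(n, kl(s, poisson.pmf(ks, lam)))
```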
“…The last inequality holds since Σ_{s=0}^{∞} s (T_α f)_s = λα. Having established the convexity of l(α), we can now deduce the full Proposition using (6).…”
Section: Proposition (mentioning)
confidence: 99%
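The reconstructed identity above is just the statement that thinning scales the mean by α. A one-line check via the standard Bernoulli representation (X ~ f with mean λ, and B_i i.i.d. Bernoulli(α) independent of X; notation assumed, not taken from the cited paper):

```latex
\mathbb{E}\,[T_\alpha X]
  \;=\; \mathbb{E}\!\left[ \sum_{i=1}^{X} B_i \right]
  \;=\; \mathbb{E}\Big[ \mathbb{E}\big[ \textstyle\sum_{i=1}^{X} B_i \,\big|\, X \big] \Big]
  \;=\; \mathbb{E}[\alpha X]
  \;=\; \alpha \lambda .
```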
“…It plays a significant role in the derivation of a maximum entropy property for the Poisson distribution (Johnson [7]). Recently there has been evidence that, in a number of problems related to information theory, the operation T_α is the discrete counterpart of the operation of scaling a random variable by √α; see [5], [6], [7], [14]. Since scaling arguments can give simple proofs of results such as the Entropy Power Inequality, we believe that improved understanding of the thinning operation could lead to discrete analogues of such results.…”
Section: Introduction (mentioning)
confidence: 99%