2008
DOI: 10.46298/dmtcs.3557

Error bounds in stochastic-geometric normal approximation

Abstract: We provide normal approximation error bounds for sums of the form $\sum_x \xi_x$, indexed by the points $x$ of a Poisson process (not necessarily homogeneous) in the unit $d$-cube, with each term $\xi_x$ determined by the configuration of Poisson points near to $x$ in some sense. We consider geometric graphs and coverage processes as examples of our general results.
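As a concrete illustration of the kind of functional these bounds cover, the sketch below simulates a homogeneous Poisson process in the unit $d$-cube and evaluates $\sum_x \xi_x$ with $\xi_x$ taken to be the degree of $x$ in a geometric graph, a local score whose value depends only on points within distance $r$ of $x$. The score choice, intensity, and radius are illustrative assumptions, not parameters from the paper.

    import numpy as np

    def poisson_points(rate, d, rng):
        """Sample a homogeneous Poisson process of the given rate in [0, 1]^d."""
        n = rng.poisson(rate)            # Poisson-distributed number of points
        return rng.uniform(size=(n, d))  # given the count, locations are i.i.d. uniform

    def score_sum(points, r):
        """Sum of local scores xi_x = degree of x in the geometric graph G(points, r)."""
        if len(points) < 2:
            return 0
        diff = points[:, None, :] - points[None, :, :]
        dist = np.sqrt((diff ** 2).sum(axis=-1))
        np.fill_diagonal(dist, np.inf)   # a point is not its own neighbor
        return int((dist <= r).sum())    # each edge contributes to two degrees

    rng = np.random.default_rng(0)
    samples = [score_sum(poisson_points(rate=200, d=2, rng=rng), r=0.05)
               for _ in range(500)]
    print(np.mean(samples), np.std(samples))  # standardized samples should look roughly normal

The paper's error bounds quantify how far the standardized version of such a sum is from the standard normal distribution.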

Cited by 3 publications (6 citation statements) · References 20 publications

Citation statements:
“…Clearly (2.12) and (2.13) imply central limit theorems whereby both $(V - \mu_V)/\sigma_V$ and $(S - \mu_S)/\sigma_S$ converge in distribution to the standard normal, thereby providing an alternative to existing proofs of these central limit theorems [12,13,16]. In the Poissonized setting, nonasymptotic bounds analogous to those in Theorems 2.1 and 2.2 are given in [15] and imply $O(n^{-1/2})$ bounds analogous to (2.12) and (2.13). In the de-Poissonized setting considered here, Chatterjee [3] provides bounds similar to those in (2.12) and (2.13), which hold for general metric spaces, but using the Kantorovich-Wasserstein distance, rather than the Kolmogorov distance considered here, and without providing any explicit constants.…”
Section: Results · mentioning · confidence: 92%
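For orientation, an $O(n^{-1/2})$ Kolmogorov-distance bound of the kind the quote refers to has the generic shape (a template in the quoted notation, not an inequality copied from either paper)

\[ \sup_{t \in \mathbb{R}} \left| \mathbb{P}\!\left( \frac{V - \mu_V}{\sigma_V} \le t \right) - \Phi(t) \right| \le \frac{C}{\sqrt{n}}, \]

where $\Phi$ is the standard normal distribution function and $C$ does not depend on $n$. The Kantorovich-Wasserstein distance mentioned in the quote integrates, rather than maximizes, the difference of the two distribution functions, so the two metrics can yield different rates.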
“…For $r > 0$, we say that $\psi$ has radius $r$ if $\psi(x, \mathcal{X})$ is unaffected by the addition of points to, or removal of points from, the point set $\mathcal{X}$ at a distance more than $r$ from $x$, that is, if for all $(x, \mathcal{X})$ we have $\psi(x, \mathcal{X}) = \psi(x, \mathcal{X} \cap B_r(x))$. The notion of radius is the same as that of range of interaction used in [15]; see also the notion of radius of stabilization, in [15,17] and elsewhere. We also define…”
mentioning · confidence: 99%
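As a toy illustration of the radius property in the quote (hypothetical score and parameters, not taken from the citing paper): the score below counts points within distance $r$ of $x$, so it has radius $r$, and the checks confirm that far-away points do not affect it.

    import numpy as np

    def psi(x, X, r):
        """Score of radius r: the number of points of X within distance r of x."""
        if len(X) == 0:
            return 0
        return int((np.linalg.norm(X - x, axis=1) <= r).sum())

    rng = np.random.default_rng(1)
    x = np.array([0.5, 0.5])
    X = rng.uniform(size=(50, 2))
    r = 0.1

    far = np.array([0.95, 0.95])                   # distance from x well above r
    ball = X[np.linalg.norm(X - x, axis=1) <= r]   # X restricted to B_r(x)

    assert psi(x, np.vstack([X, far[None, :]]), r) == psi(x, X, r)  # adding a far point: no change
    assert psi(x, ball, r) == psi(x, X, r)         # psi(x, X) == psi(x, X ∩ B_r(x))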
“…(i) Comparing (2.17) with existing results. The results at (2.17) are applicable in the setting of volume order scaling of the variances, i.e., when the variances of $H_s$ and $H'_n$ exhibit scaling proportional to $s$ and $n$. The rate for Poisson input in (2.17) significantly improves upon the rate given by Theorem 2.1 of [29] (see also Lemma 4.4 of [24]), Corollary 3.1 of [5], and Theorem 2.3 in [26], which all contain extraneous logarithmic factors and which rely on dependency graph methods. The rate in (2.17) for binomial input is new, as up to now there are no explicit general rates of normal convergence for sums of stabilizing score functions $\xi_n$ of binomial input.…”
Section: Results · mentioning · confidence: 99%
“…In the papers [6,24,27] abstract central limit theorems for stabilizing functionals are derived and applied to several problems from stochastic geometry. Quantitative bounds for the normal approximation of stabilizing functionals of an underlying Poisson point process are given in [5,26,29,30,46]. These results yield rates of convergence for the Kolmogorov distance of the order $1/\sqrt{\operatorname{Var} H_s}$ times some extraneous logarithmic factors.…”
Section: Introduction · mentioning · confidence: 99%