2008
DOI: 10.1098/rspa.2008.0235

A law of large numbers for nearest neighbour statistics

Abstract: In practical data analysis, methods based on proximity (near-neighbour) relationships between sample points are important because these relations can be computed in time O(n log n) as the number of points n → ∞. Associated with such methods is a class of random variables defined to be functions of a given point and its nearest neighbours in the sample. If the sample points are independent and identically distributed, the associated random variables will also be identically distributed but not independent. Despi…
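The abstract's two ingredients, an O(n log n) proximity computation and sample averages of dependent, identically distributed nearest-neighbour variables, can be illustrated with a small simulation. The sketch below is my own illustration, not code from the paper; it assumes Python with NumPy/SciPy, uniform points on the unit square, and one particular per-point statistic (the scaled nearest-neighbour distance), whose sample mean stabilises as n grows.

```python
# Minimal sketch (illustration only, not the paper's construction):
# for i.i.d. sample points, compute a per-point nearest-neighbour statistic
# via a k-d tree (the O(n log n) proximity computation) and watch its sample
# mean stabilise as n grows, even though the per-point terms are dependent.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
d = 2  # dimension (illustrative choice)

def mean_nn_statistic(n):
    pts = rng.random((n, d))                  # i.i.d. uniform sample on [0, 1]^d
    tree = cKDTree(pts)                       # build in O(n log n)
    dist, _ = tree.query(pts, k=2)            # column 0 is the point itself
    nn_dist = dist[:, 1]                      # distance to the nearest neighbour
    return np.mean(n ** (1.0 / d) * nn_dist)  # scale so the mean has a non-trivial limit

for n in (1_000, 10_000, 100_000):
    print(n, mean_nn_statistic(n))
```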

Cited by 18 publications (23 citation statements)
References 24 publications
“…Denoting as the distance from to its th nearest neighbor in , we express the (conditional) mean distance to the th nearest neighbor using (27) as (31) where, as before, the expectation is taken over . Evans et al. [29] derived a more general expression for the distance to the th nearest neighbor as (32) where using the approximation (33) one obtains the same relation as in (31). Note that in our case, we consider a fixed search radius of which is chosen independent of the image patches.…”
Section: B. Relationship Between Denoising Bounds and Entropy (mentioning)
confidence: 99%
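For context on the quoted passage: the mean distance to the kth nearest neighbour admits a commonly used asymptotic approximation when the sampling density is locally constant. The sketch below is my own illustration, not expression (32) or approximation (33) from the citing paper; it assumes uniform points on [0, 1]^d, ignores edge effects, and compares the empirical mean kth-nearest-neighbour distance with Γ(k + 1/d)/Γ(k) · (n V_d)^(−1/d), where V_d is the volume of the unit d-ball (replacing the Gamma ratio by k^(1/d) gives the cruder form (k/(n V_d))^(1/d)).

```python
# Minimal sketch (illustration only): empirical mean k-th nearest-neighbour
# distance vs. the asymptotic approximation described in the lead-in,
# for uniform points on [0, 1]^d with edge effects ignored.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

rng = np.random.default_rng(1)
n, d, k = 50_000, 3, 4

pts = rng.random((n, d))                       # uniform sample on [0, 1]^d
dist, _ = cKDTree(pts).query(pts, k=k + 1)     # column 0 is the point itself
empirical = dist[:, k].mean()                  # empirical mean k-th NN distance

V_d = np.pi ** (d / 2) / gamma(d / 2 + 1)      # volume of the unit ball in R^d
approx = gamma(k + 1 / d) / gamma(k) * (n * V_d) ** (-1.0 / d)

print(f"empirical: {empirical:.4f}   asymptotic approximation: {approx:.4f}")
```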
“…An edge effect means that points located close to the domain borders are problematic, because part of the circle within which points are supposed to be counted lies outside the domain. Ignoring this effect results in underestimating K. D. Evans [14] studies a class of random variables defined to be functions of a given point and its nearest neighbors. If the sample points are independent and identically distributed, the associated random variables will also be identically distributed but not independent.…”
Section: Introduction (mentioning)
confidence: 99%
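As a concrete illustration of the edge effect described in the quoted passage (my own sketch, not code from [14]): for complete spatial randomness on the unit square, Ripley's K function satisfies K(r) = πr², but a naive estimator that ignores the part of each circle falling outside the domain systematically comes out smaller.

```python
# Minimal sketch (illustration only): naive, uncorrected estimate of Ripley's K
# for a uniform point pattern on the unit square, compared with pi * r^2.
# The shortfall grows with r as more circles stick out of the domain.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
n = 5_000
pts = rng.random((n, 2))                      # complete spatial randomness on [0, 1]^2
tree = cKDTree(pts)

for r in (0.05, 0.10, 0.15):
    pairs = tree.count_neighbors(tree, r) - n  # ordered pairs within r, minus self-pairs
    K_naive = pairs / (n * (n - 1))            # area = 1, no edge correction
    print(f"r={r:.2f}  naive K={K_naive:.4f}  pi*r^2={np.pi * r**2:.4f}")
```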
“…Evans et al. [19] establish an upper bound on the rate of decay of the variance, while the authors of [9,10] provide upper bounds on the ℓ1 rate of convergence.…”
Section: Introduction (mentioning)
confidence: 99%
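The quoted passage concerns how fast the variance of such nearest-neighbour averages decays. The sketch below is my own illustration, not the bound from [19]; it simply estimates, by repetition, the variance of the sample mean of a scaled nearest-neighbour distance for uniform points in the unit square at a few sample sizes.

```python
# Minimal sketch (illustration only): Monte Carlo estimate of the variance of
# the sample mean of a scaled nearest-neighbour distance, at increasing n.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
d, reps = 2, 200

def sample_mean(n):
    pts = rng.random((n, d))
    dist, _ = cKDTree(pts).query(pts, k=2)     # column 0 is the point itself
    return np.mean(n ** (1.0 / d) * dist[:, 1])

for n in (500, 2_000, 8_000):
    var = np.var([sample_mean(n) for _ in range(reps)])
    print(f"n={n:5d}  Var(sample mean) ~ {var:.2e}")
```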