2019
DOI: 10.48550/arxiv.1907.00196
Preprint
Statistical estimation of the Kullback-Leibler divergence

Abstract: Wide conditions are provided to guarantee asymptotic unbiasedness and L²-consistency of the introduced estimates of the Kullback-Leibler divergence for probability measures in R^d having densities w.r.t. the Lebesgue measure. These estimates are constructed by means of two independent collections of i.i.d. observations and involve the specified k-nearest neighbor statistics. In particular, the established results are valid for estimates of the Kullback-Leibler divergence between any two Gaussian measures …
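The construction described in the abstract (two independent i.i.d. samples plus k-nearest-neighbor distances) can be sketched in code. The snippet below is a minimal illustration of a classical k-NN Kullback-Leibler estimator of this type (the Wang-Kulkarni-Verdú form), not necessarily the exact estimator analyzed in the paper; the function name and the choice of k are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Sketch of a k-NN estimate of KL(P || Q) from samples x ~ P, y ~ Q.

    x: (n, d) array of i.i.d. draws from P; y: (m, d) array of i.i.d. draws
    from Q. Illustrative only -- not the paper's exact estimator.
    """
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its k-th nearest neighbor among the other
    # x's (query k+1 neighbors because the nearest one is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_i: distance from x_i to its k-th nearest neighbor among the y's.
    dists = cKDTree(y).query(x, k=k)[0]
    nu = dists if k == 1 else dists[:, -1]
    # Classical k-NN estimate: (d/n) * sum_i log(nu_i / rho_i) + log(m/(n-1)).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

For two univariate Gaussians N(0, 1) and N(1, 1), the true divergence is 1/2, and with a few thousand samples the estimate lands close to that value; this is the Gaussian setting the abstract singles out.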

Cited by 0 publications
References 32 publications