2018
DOI: 10.48550/arxiv.1805.07088
Preprint

Strong Consistency of a Kullback-Leibler Divergence Estimator and Tests for Model Selection Based on a Bias Reduced Kernel Density Estimator

Abstract: In this paper, we study the strong consistency of a bias reduced kernel density estimator and derive a strongly consistent Kullback-Leibler divergence (KLD) estimator. As an application, we formulate a goodness-of-fit test and an asymptotically standard normal test for model selection. Monte Carlo simulations show the effectiveness of the proposed estimation methods and statistical tests.
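To make the plug-in idea behind such an estimator concrete, the sketch below estimates KL(f || g) = E_f[log f(X) - log g(X)] by replacing the unknown density f with a kernel density estimate and approximating the expectation by a sample average. This is only an illustration under stated assumptions: scipy's standard Gaussian KDE stands in for the paper's bias reduced kernel density estimator, and the function name kld_estimate is hypothetical.

```python
import numpy as np
from scipy import stats

def kld_estimate(sample, model_logpdf):
    """Plug-in estimate of KL(f || g) = E_f[log f(X) - log g(X)],
    with f replaced by a kernel density estimate and the expectation
    approximated by an average over the observed sample."""
    # A standard Gaussian KDE is used here in place of the paper's
    # bias reduced kernel density estimator (assumption for illustration).
    kde = stats.gaussian_kde(sample)
    log_f = np.log(kde(sample))      # estimated log-density of the data
    log_g = model_logpdf(sample)     # log-density under the candidate model g
    return np.mean(log_f - log_g)

# Usage: compare two candidate models for data drawn from N(0, 1).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=500)
kld_normal = kld_estimate(x, stats.norm(0.0, 1.0).logpdf)    # true model
kld_laplace = kld_estimate(x, stats.laplace(0.0, 1.0).logpdf)  # misspecified model
print(f"KLD vs N(0,1):       {kld_normal:.4f}")
print(f"KLD vs Laplace(0,1): {kld_laplace:.4f}")
```

Averaging over the sample avoids numerical integration, and a model selection test of the kind described in the abstract would compare candidate models through the difference of such estimated divergences.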
