2016
DOI: 10.11648/j.ajtas.20160501.13

Kernel-Type Estimators of Divergence Measures and Its Strong Uniform Consistency

Abstract: Nonparametric density estimation based on kernel-type estimators is a very popular method in statistical research, especially when we want to model the probabilistic or stochastic structure of a data set. In this paper, we investigate the asymptotic confidence bands for the distribution with kernel estimators for some types of divergence measures (Rényi-α and Tsallis-α divergence). Our aim is to use methods based on empirical process techniques in order to derive some asymptotic results. Under different …
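For orientation, a minimal sketch under standard definitions (the paper's exact notation and regularity conditions may differ): the Rényi-α and Tsallis-α divergences between two densities f and g, and the plug-in estimator obtained by replacing the unknown densities with kernel density estimates.

\[
  D_{R,\alpha}(f,g) = \frac{1}{\alpha - 1}\,\log \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, dx,
  \qquad
  D_{T,\alpha}(f,g) = \frac{1}{\alpha - 1}\left( \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, dx - 1 \right),
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
\[
  \widehat{f}_n(x) = \frac{1}{n h_n} \sum_{i=1}^{n} K\!\left( \frac{x - X_i}{h_n} \right),
  \qquad
  \widehat{D}_{\alpha,n} = D_{\alpha}\big( \widehat{f}_n, \widehat{g}_n \big),
\]

where K is a kernel, h_n a bandwidth sequence, and \widehat{g}_n is built analogously from the second sample.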

Cited by 2 publications (3 citation statements)
References: 23 publications
“…We choose two values of π, namely π = 0.5 and π = 0.75. Note that π = 0.5 is the value for which Dhaker et al. (2016) showed that the Kullback-Leibler and Hellinger divergence estimators based on the kernel density estimator are strongly consistent. We consider the Kullback-Leibler and Hellinger divergences based on the bias-reduced kernel density estimator (D_KL1 and D_H1), and we further consider the Kullback-Leibler and Hellinger divergences based on the kernel density estimator (D_KL2 and D_H2).…”
Section: Performance of φ-Divergence Estimator
Citation type: mentioning (confidence: 99%)
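As a rough illustration of the plug-in idea discussed in this excerpt (not the authors' exact estimators, and without the bias-reduction step mentioned above), the Kullback-Leibler and Hellinger divergences can be estimated by substituting Gaussian kernel density estimates into the divergence formulas and integrating on a grid. The function name and grid settings below are illustrative choices.

# Minimal sketch of plug-in divergence estimation from kernel density estimates.
# Illustrative only: grid integration on a finite window, Gaussian kernels via
# scipy.stats.gaussian_kde; the bias-reduced variants in the text are not shown.
import numpy as np
from scipy.stats import gaussian_kde

def plugin_divergences(x_sample, y_sample, grid_size=2000, eps=1e-12):
    """Estimate KL(f || g) and the Hellinger distance between the densities
    underlying two univariate samples, via kernel plug-in estimates."""
    f_hat = gaussian_kde(x_sample)          # KDE for the first sample
    g_hat = gaussian_kde(y_sample)          # KDE for the second sample

    lo = min(x_sample.min(), y_sample.min())
    hi = max(x_sample.max(), y_sample.max())
    grid = np.linspace(lo, hi, grid_size)
    dx = grid[1] - grid[0]

    f = np.clip(f_hat(grid), eps, None)     # avoid log(0) and division by zero
    g = np.clip(g_hat(grid), eps, None)

    kl = np.sum(f * np.log(f / g)) * dx                              # D_KL(f || g)
    hellinger = np.sqrt(0.5 * np.sum((np.sqrt(f) - np.sqrt(g))**2) * dx)
    return kl, hellinger

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=500)
    y = rng.normal(0.5, 1.2, size=500)
    print(plugin_divergences(x, y))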
“…Einmahl and Mason (2005) proved the uniform-in-bandwidth consistency of kernel-type function estimators. Dhaker et al. (2016) proposed a strongly uniformly consistent kernel-type estimator of divergence measures. Rudemo (1982) and Bowman et al. (1984) introduced a convenient method for choosing the optimal bandwidth of the kernel density estimator in practice, using cross-validation.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
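A minimal sketch of the least-squares cross-validation criterion referred to here (Rudemo 1982; Bowman et al. 1984), written for a univariate sample and a Gaussian kernel; this is the textbook criterion in closed form, not code from the cited papers, and the function names are illustrative.

# Least-squares cross-validation (LSCV) bandwidth selection for a univariate
# Gaussian-kernel density estimator; illustrative sketch only.
import numpy as np

def lscv_score(h, x):
    """LSCV(h) = integral of f_h^2 minus (2/n) * sum_i f_{h,-i}(X_i),
    evaluated in closed form for the Gaussian kernel."""
    n = x.size
    d = (x[:, None] - x[None, :]) / h          # pairwise scaled differences
    # The squared-density integral uses the kernel convolved with itself: N(0, 2).
    conv = np.exp(-d**2 / 4.0) / (2.0 * np.sqrt(np.pi))
    int_f2 = conv.sum() / (n**2 * h)
    # Leave-one-out term: drop the diagonal (j = i) contributions.
    k = np.exp(-d**2 / 2.0) / np.sqrt(2.0 * np.pi)
    loo = (k.sum() - k.trace()) / ((n - 1) * h)
    return int_f2 - 2.0 * loo / n

def lscv_bandwidth(x, candidates):
    """Return the candidate bandwidth minimizing the LSCV criterion."""
    scores = [lscv_score(h, x) for h in candidates]
    return candidates[int(np.argmin(scores))]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sample = rng.normal(size=300)
    grid = np.linspace(0.05, 1.0, 40)
    print("LSCV bandwidth:", lscv_bandwidth(sample, grid))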
“…Instead, the literature offers many kinds of results on the almost-sure efficiency of the estimation, with rates of convergence and laws of the iterated logarithm, L^p (p = 1, 2) convergence, etc. To be precise, [Dhaker et al. (2016)] used recent techniques based on the functional empirical process to provide a series of interesting rates of convergence of the estimators, in a one-sided approach, for the Rényi, Tsallis, and Kullback-Leibler classes, to cite a few. Unfortunately, the authors did not address the problem of integrability, taking it for granted that the divergence measures are finite.…”
Section: Statistical Estimation
Citation type: mentioning (confidence: 99%)
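To make the integrability point concrete (a standard observation, not taken from the cited papers): the divergence is finite only when the defining integral converges. For the Kullback-Leibler divergence,

\[
  D_{KL}(f,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx,
\]

finiteness requires f to be absolutely continuous with respect to g and the integrand to be integrable. For instance, taking f standard Cauchy and g standard normal gives D_{KL}(f,g) = +\infty, since \log\big(f(x)/g(x)\big) grows like x^2/2 as |x| \to \infty while f(x) decays only like 1/(\pi x^2), so the integrand does not vanish at infinity.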