2011
DOI: 10.1214/11-aos883
Bandwidth selection in kernel density estimation: Oracle inequalities and adaptive minimax optimality

Abstract: We address the problem of density estimation with L_s-loss by selection of kernel estimators. We develop a selection procedure and derive the corresponding L_s-risk oracle inequalities. It is shown that the proposed selection rule leads to an estimator that is minimax adaptive over a scale of anisotropic Nikol'skii classes. The main technical tools used in our derivations are uniform bounds on the L_s-norms of empirical processes developed recently by Goldenshluger and Lepski.
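The selection rule the abstract describes compares, for each candidate bandwidth, auxiliary estimators built from pairs of kernels against a stochastic majorant, and picks the bandwidth minimizing the resulting criterion. The following is a minimal 1-D sketch of that Goldenshluger-Lepski-style comparison, assuming a Gaussian kernel and an L2 criterion; the constant `c` and the form of the majorant are illustrative placeholders, not the paper's calibrated majorant.

```python
import numpy as np

def kde(x_grid, data, h):
    """Gaussian kernel density estimate on x_grid with bandwidth h."""
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def gl_select(data, x_grid, bandwidths, c=1.0):
    """Goldenshluger-Lepski-style bandwidth selection (L2 norm, sketch).

    For each pair (h, h'), compares the auxiliary estimator built from
    the convolved kernel K_h * K_{h'} -- for Gaussian kernels this is
    again Gaussian, with bandwidth sqrt(h^2 + h'^2) -- to the estimator
    at h', with the excess measured against a stochastic majorant.
    """
    n = len(data)
    dx = x_grid[1] - x_grid[0]
    est = {h: kde(x_grid, data, h) for h in bandwidths}
    # majorant: crude proxy for the bound on the stochastic error (assumption)
    maj = {h: c / np.sqrt(n * h) for h in bandwidths}
    crit = {}
    for h in bandwidths:
        excess = []
        for hp in bandwidths:
            h_conv = np.sqrt(h**2 + hp**2)   # bandwidth of K_h * K_{h'}
            aux = kde(x_grid, data, h_conv)  # auxiliary estimator
            diff = np.sqrt(np.sum((aux - est[hp])**2) * dx)  # L2 distance
            excess.append(max(diff - maj[hp], 0.0))
        crit[h] = max(excess) + maj[h]
    return min(crit, key=crit.get)
```

In practice the majorant must dominate the L_s-norm of the empirical process uniformly over the bandwidth grid, which is exactly where the uniform bounds mentioned in the abstract enter; the toy `c / sqrt(n*h)` here only mimics the usual variance scaling.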

Cited by 188 publications (244 citation statements); references 24 publications.
“…The choice of K proposed in Kerkyacharian et al (2001) and in Goldenshluger and Lepski (2014) implies, see for instance Goldenshluger and Lepski (2011), for any 1…”
Section: Particular Cases
confidence: 99%
“…Our selection method of the dimension parameter is inspired by the work of Goldenshluger and Lepski [2011] and combines the techniques of model selection and Lepski's method. We determine the dimension parameter among a collection of admissible values by minimizing a penalized contrast function.…”
Section: Methodology and Background
confidence: 99%
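The penalized-contrast selection described in the excerpt above can be sketched for an orthogonal-series density estimator. The example below uses a cosine basis on [0, 1] and a linear penalty `c * m / n`; both the basis choice and the penalty calibration are assumptions for illustration, not the cited paper's construction.

```python
import numpy as np

def dimension_select(data, m_max, c=2.0):
    """Penalized-contrast selection of the dimension parameter m for a
    cosine-series density estimator on [0, 1] (illustrative sketch).

    The projection estimator with dimension m uses empirical coefficients
    a_j = mean(phi_j(X_i)) for phi_j(x) = sqrt(2) cos(pi j x). The contrast
    -sum_{j<=m} a_j^2 decreases in m, so a penalty (here c*m/n, an assumed
    toy calibration) balances approximation against stochastic error.
    """
    n = len(data)
    coefs = np.array([np.mean(np.sqrt(2) * np.cos(np.pi * j * data))
                      for j in range(1, m_max + 1)])
    crit = [-np.sum(coefs[:m]**2) + c * m / n for m in range(1, m_max + 1)]
    return int(np.argmin(crit)) + 1  # selected dimension, in 1..m_max
```

The excerpt's method additionally combines this with Lepski-type pairwise comparisons over admissible dimensions; the sketch shows only the penalized-contrast half.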
“…, Z n from a minimax point of view. The estimator is based on an orthogonal series approach where the fully data-driven selection of the dimension parameter is inspired by the recent work of Goldenshluger and Lepski [2011]. We derive conditions that allow us to bound the maximal risk of the fully data-driven estimator over suitably chosen classes F for f , which are constructed flexibly enough to characterize, in particular, differentiable or analytic functions.…”
Section: Introduction
confidence: 99%
“…Data-driven methods exist, such as the one very recently introduced by Goldenshluger and Lepski [25,24,17], which permit one to choose the optimal parameter h.…”
Section: Kernel Density Estimation
confidence: 99%