1985
DOI: 10.1007/bf02564838

Strong consistency of the kernel estimators of conditional density function

Cited by 12 publications (4 citation statements) | References 3 publications
“…Several nonparametric methods have been proposed for estimating a conditional density: Hyndman et al (1996) and Fan et al (1996) improved the seminal Nadaraya-Watson-type estimator of Rosenblatt (1969) and Lincheng and Zhijun (1985), and De Gooijer and Zerom (2003) introduced another weighted kernel estimator. For these kernel estimators, different methods have been advocated to tackle the bandwidth selection issue: a bootstrap approach (Bashtannyk and Hyndman, 2001) or cross-validation variants (Fan and Yim, 2004; Holmes et al, 2010; Ichimura and Fukuda, 2010).…”
Section: Motivations (mentioning)
confidence: 99%
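To make the construction this excerpt refers to concrete, here is a minimal Python/NumPy sketch of a Nadaraya-Watson-type conditional density estimate. It is an illustration under my own assumptions, not code from the cited papers; the Gaussian kernel and the fixed bandwidths hx and hy are arbitrary choices (in practice they would be selected by bootstrap or cross-validation as the excerpt notes).

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def nw_conditional_density(x0, y0, X, Y, hx, hy):
    """Nadaraya-Watson-type estimate of the conditional density f(y0 | x0):
    a kernel smoother in y, weighted by kernels centred on the conditioning
    variable x.  X and Y are 1-D arrays of paired observations."""
    wx = gaussian_kernel((x0 - X) / hx)        # weights from the x-kernel
    ky = gaussian_kernel((y0 - Y) / hy) / hy   # scaled kernel in y
    return float(np.sum(wx * ky) / np.sum(wx))

# Hypothetical usage on simulated paired data
rng = np.random.default_rng(0)
X = rng.normal(size=500)
Y = 2.0 * X + rng.normal(size=500)
print(nw_conditional_density(x0=0.0, y0=0.0, X=X, Y=Y, hx=0.3, hy=0.3))
```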
“…Once β(x) is fitted, we can sample from the estimated density f(x) by importance sampling with g(x) as the proposal distribution. In this paper, the density ratio is estimated by nonparametric methods based on the k-nearest-neighbors kernel density estimator (Lincheng & Zhijun 1985), which approximates the density at a point by a kernel smoother applied to the k nearest neighbors of that point, and the Spectral Series estimator (Izbicki et al 2014), which combines orthogonal series expansion and adaptively chosen bases to construct nonparametric likelihood functions. With the density ratio, we improve the Gaussian likelihood model using nonparametric methods.…”
Section: Multivariate Likelihood Models (mentioning)
confidence: 99%
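The following sketch illustrates the idea described in this excerpt in one dimension: estimate the density ratio β(x) = f(x)/g(x) with two k-nearest-neighbor kernel density estimates and use it as importance weights on draws from the proposal g. It is a simplified stand-in under my own assumptions, not the authors' implementation; the function names, the choice k = 25, and the simulated samples are hypothetical.

```python
import numpy as np

def knn_kernel_density(x0, sample, k):
    """k-nearest-neighbor kernel density estimate at x0 (1-D sketch):
    the bandwidth adapts to the distance of the k-th nearest neighbor,
    and a Gaussian kernel smoother is applied to those k neighbors."""
    d = np.abs(np.asarray(sample) - x0)
    nearest = np.sort(d)[:k]
    h = nearest[-1] + 1e-12                       # adaptive bandwidth
    kern = np.exp(-0.5 * (nearest / h) ** 2) / np.sqrt(2.0 * np.pi)
    return kern.sum() / (len(sample) * h)

def importance_weights(points, f_sample, g_sample, k=25):
    """Estimated density ratio beta(x) = f(x)/g(x) at the proposal draws,
    normalised to sum to one, using two kNN kernel density estimates."""
    beta = np.array([knn_kernel_density(x, f_sample, k) /
                     knn_kernel_density(x, g_sample, k) for x in points])
    return beta / beta.sum()

# Hypothetical usage: draw from the proposal g, then resample with the
# estimated ratio as importance weights to approximate draws from f.
rng = np.random.default_rng(1)
f_sample = rng.normal(1.0, 1.0, size=2000)        # stand-in for data from f
g_sample = rng.normal(0.0, 2.0, size=2000)        # stand-in for data from g
proposal = rng.normal(0.0, 2.0, size=1000)        # fresh draws from g
w = importance_weights(proposal, f_sample, g_sample)
approx_f = rng.choice(proposal, size=1000, p=w)
```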
“…In this paper, we take l = 0, which corresponds to a leave-one-out cross-validated negative log-predictive likelihood. We take f̂_θ to be a hybrid of the kernel density estimator and the k-nearest-neighbor estimator, also known as a kernel nearest-neighbor estimator [39], of the predictive density. The kernel nearest-neighbor estimator of the predictive density of X_0 given X_{−}^{−1} is given by…”
Section: The Negative Log-predictive Likelihood (mentioning)
confidence: 99%
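As a rough illustration of the criterion this excerpt describes, the sketch below computes a leave-one-out cross-validated negative log-predictive likelihood for a univariate kernel nearest-neighbor density estimate (Gaussian kernel with bandwidth equal to the distance to the k-th nearest neighbor). It is my own simplified, univariate stand-in, not the authors' multivariate predictive-density estimator; the function names and the grid of k values are hypothetical.

```python
import numpy as np

def kernel_nn_density(x0, sample, k):
    """Kernel nearest-neighbor density estimate: a Gaussian kernel density
    estimate whose bandwidth at x0 is the distance from x0 to its k-th
    nearest neighbor in `sample`."""
    d = np.abs(np.asarray(sample) - x0)
    h = np.sort(d)[min(k, len(sample)) - 1] + 1e-12
    return np.mean(np.exp(-0.5 * (d / h) ** 2)) / (h * np.sqrt(2.0 * np.pi))

def loo_neg_log_predictive_likelihood(sample, k):
    """Leave-one-out cross-validated negative log-predictive likelihood:
    each observation is scored under the estimate built from the others."""
    sample = np.asarray(sample, dtype=float)
    nll = 0.0
    for i in range(len(sample)):
        rest = np.delete(sample, i)
        nll -= np.log(kernel_nn_density(sample[i], rest, k) + 1e-300)
    return nll

# Hypothetical usage: pick k by minimising the criterion over a small grid.
data = np.random.default_rng(2).normal(size=300)
best_k = min(range(2, 40), key=lambda k: loo_neg_log_predictive_likelihood(data, k))
```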