2013
DOI: 10.2139/ssrn.2342165

Nonparametric Kernel Density Estimation Near the Boundary

Abstract: Standard fixed symmetric kernel type density estimators are known to encounter problems for positive random variables with a large probability mass close to zero. We show that in such settings, alternatives of asymmetric gamma kernel estimators are superior but also differ in asymptotic and finite sample performance conditional on the shape of the density near zero and the exact form of the chosen kernel. We therefore suggest a refined version of the gamma kernel with an additional tuning parameter according to…
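
To make the class of estimators the abstract discusses concrete, here is a minimal sketch of a standard asymmetric gamma kernel density estimator in the spirit of Chen (2000), i.e. the baseline that the paper refines. The refined kernel with its additional tuning parameter is not reproduced here; the function name, bandwidth value, and simulated sample are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gamma

def gamma_kernel_kde(x_grid, data, b):
    # Asymmetric gamma kernel density estimate on [0, inf):
    # at each evaluation point x the kernel is a Gamma density with
    # shape x/b + 1 and scale b, so the kernel adapts its shape near
    # the boundary instead of placing mass below zero.
    x_grid = np.asarray(x_grid, dtype=float)
    data = np.asarray(data, dtype=float)
    est = np.empty_like(x_grid)
    for j, x in enumerate(x_grid):
        est[j] = gamma.pdf(data, a=x / b + 1.0, scale=b).mean()
    return est

# Illustrative use on a positive sample with substantial mass near zero
rng = np.random.default_rng(0)
sample = rng.gamma(shape=0.8, scale=1.0, size=500)
grid = np.linspace(0.0, 4.0, 81)
fhat = gamma_kernel_kde(grid, sample, b=0.1)
```

Because the gamma kernel's shape varies with the evaluation point, the estimator stays supported on the positive half-line, which is the boundary behaviour the paper's refinement targets.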

Cited by 21 publications (33 citation statements)
References 43 publications (18 reference statements)
“…In this section the practical performances of the Local Likelihood-based estimators are compared to those of their competitors. The same 7 test R+-supported densities as in Malec and Schienle (2014) were considered.…”
Section: Simulation Study
confidence: 99%
“…Although those methods may slightly improve the performance compared to their basic versions, they are more complicated to implement. For instance, Malec and Schienle (2014)…”
Section: Simulation Study
confidence: 99%
“…, x_n} from a distribution with unknown density f(x), the associated kernel density estimator can be expressed as [Malec and Schienle, 2014] f̂(x) = (1/nb)…”
Section: Copula-based Particle Filter
confidence: 99%
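
The quoted formula is cut off after the 1/(nb) factor. Assuming it continues in the usual fixed-bandwidth form f̂(x) = (1/(nb)) Σᵢ K((x − xᵢ)/b), a minimal sketch is given below; the Gaussian kernel default and the function names are assumptions for illustration, not taken from the cited work.

```python
import numpy as np

def fixed_kernel_kde(x, data, b):
    # f_hat(x) = (1 / (n * b)) * sum_i K((x - x_i) / b), with a Gaussian K assumed
    data = np.asarray(data, dtype=float)
    u = (np.asarray(x, dtype=float)[..., None] - data) / b
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return k.sum(axis=-1) / (data.size * b)

# Illustrative use on a positive-valued sample
rng = np.random.default_rng(1)
sample = rng.exponential(scale=1.0, size=300)
fhat = fixed_kernel_kde(np.linspace(0.0, 4.0, 81), sample, b=0.3)
```

For data like the exponential sample above, this fixed symmetric kernel leaks probability mass below zero near the boundary, which is exactly the problem the asymmetric gamma kernels address.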
“…KDE is a fundamental data-smoothing problem satisfying the continuity or differentiability property such that data are sampled from a given distribution. Although there are various kernel functions [17], this work refers to a Gaussian kernel function, which is most often used in the literature and is adequate to address non-Gaussian noise [7].…”
Section: Non-Gaussian Square-root Unscented Particle Filter
confidence: 99%
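
As a sketch of how a Gaussian-kernel KDE is typically used to smooth a weighted particle cloud in such a filter (the particle values, weights, and evaluation grid below are made-up illustrations, not the cited filter's actual setup):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Made-up particle cloud and weights standing in for one filter update step
rng = np.random.default_rng(2)
particles = rng.standard_t(df=3, size=1000)              # non-Gaussian posterior sample
weights = np.full(particles.size, 1.0 / particles.size)  # uniform weights after resampling

# A Gaussian-kernel KDE turns the discrete weighted particles into a smooth density
density = gaussian_kde(particles, weights=weights)
grid = np.linspace(-6.0, 6.0, 121)
smoothed = density(grid)                   # evaluate the smoothed density on a grid
regenerated = density.resample(size=1000)  # draw fresh particles from the smoothed density
```

Here the bandwidth is left to gaussian_kde's default rule of thumb; in a filtering application it would normally be tuned to the particle spread at each step.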