2008
DOI: 10.1214/07-ejs130
Kullback-Leibler property of kernel mixture priors in Bayesian density estimation

Abstract: Positivity of the prior probability of Kullback-Leibler neighborhood around the true density, commonly known as the Kullback-Leibler property, plays a fundamental role in posterior consistency. A popular prior for Bayesian estimation is given by a Dirichlet mixture, where the kernels are chosen depending on the sample space and the class of densities to be estimated. The Kullback-Leibler property of the Dirichlet mixture prior has been shown for some special kernels like the normal density or Bernstein polynom…
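The Kullback-Leibler property described in the abstract says that the prior puts positive mass on every KL neighborhood {f : KL(f0, f) < ε} of the true density f0. As a minimal numerical sketch of why kernel mixtures can lie in small KL neighborhoods (this is illustrative only, not code from the paper; the function names, mixture weights, and locations are ours), one can check that a normal mixture close to a standard normal f0 yields a small positive KL divergence:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def kl_divergence(f0, f, grid):
    """Trapezoid-rule approximation of KL(f0 || f) = integral of f0 * log(f0 / f)."""
    y = f0(grid) * np.log(f0(grid) / f(grid))
    return float(np.sum((y[1:] + y[:-1]) * np.diff(grid)) / 2.0)

# True density f0: standard normal.
f0 = lambda x: normal_pdf(x, 0.0, 1.0)

# A two-component normal mixture near f0 (weights and locations are illustrative).
f = lambda x: 0.5 * normal_pdf(x, -0.1, 1.0) + 0.5 * normal_pdf(x, 0.1, 1.0)

grid = np.linspace(-10.0, 10.0, 4001)
print(kl_divergence(f0, f, grid))  # a small positive number
```

Mixtures like f above approximate f0 arbitrarily well as the component locations shrink toward 0, which is the kind of approximation a KL-property argument for Dirichlet mixture priors rests on.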

Cited by 73 publications (85 citation statements). References 17 publications.
“…Such a property plays a very important role in consistency. The result generalizes Theorem 5 of Wu and Ghosal [16].…”
Section: Kullback-Leibler Property (supporting, confidence: 85%)
“…By a theorem of Schwartz [11], this implies weak consistency. Note that Theorem 5 of Wu and Ghosal [16] for the Dirichlet mixture with a scaled type multivariate normal density as its kernel is a special case of Theorem 2 in this paper. In Section 4, we first state a lemma which gives bounds for metric entropies, then state another lemma which gives the sufficient conditions to satisfy Condition (A1).…”
Section: Introduction (mentioning, confidence: 83%)
“…These are the same conditions as those considered in [Wu and Ghosal, 2008] in the case where the base measure of the Dirichlet process is not data dependent. Sections 1.1 and 2 concern various aspects of Bayesian nonparametric or semiparametric estimation procedures.…”
Section: Plug-in Case (mentioning, confidence: 90%)
“…The nonparametric nature of K can be maintained by putting a prior with large weak support, such as the Dirichlet process. A recent result of [61] shows that nonparametric Bayesian density estimation based on a skew-normal kernel is consistent under the weak topology, adding a strong justification for the use of this kernel. Interestingly, if the theoretical standard normal null distribution is way off from the empirical one, then one can incorporate this feature in the model by allowing K to assign weights to skew-normal components stochastically larger than the standard normal.…”
Section: Dependent Case: Skew-Normal Mixture Model for Probit P-values (mentioning, confidence: 99%)