2014
DOI: 10.1111/rssb.12059

Marginally Specified Priors for Non-Parametric Bayesian Estimation

Abstract: Prior specification for non-parametric Bayesian inference involves the difficult task of quantifying prior knowledge about a parameter of high, often infinite, dimension. A statistician is unlikely to have informed opinions about all aspects of such a parameter but will have real information about functionals of the parameter, such as the population mean or variance. The paper proposes a new framework for non-parametric Bayes inference in which the prior distribution for a possibly infinite dimensional…
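As a reading aid, here is a minimal sketch of the construction described in the abstract and in the citation statements below. The notation (Π₀ for a standard non-parametric prior on θ, t(θ) for the low-dimensional functional, π₁ for the elicited marginal prior, π₀^ψ for the marginal that Π₀ induces on ψ = t(θ)) is assumed for illustration and is not taken verbatim from the paper.

```latex
% Minimal sketch of a marginally specified prior (assumed notation, not verbatim
% from the paper): keep the conditional law of \theta given \psi under \Pi_0,
% but replace the induced marginal of \psi = t(\theta) with an elicited prior \pi_1.
\[
  \Pi(d\theta) \;\propto\;
  \Pi_0(d\theta)\,
  \frac{\pi_1\!\bigl(t(\theta)\bigr)}{\pi_0^{\psi}\!\bigl(t(\theta)\bigr)},
  \qquad \psi = t(\theta),
\]
% so that, under \Pi, \psi \sim \pi_1 marginally while \theta \mid \psi retains
% its \Pi_0-conditional distribution.
```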

Cited by 13 publications (8 citation statements)
References 32 publications (58 reference statements)
“…In the context of a single probability measure, Kessler, Hoff and Dunson (2015) proposed a clever construction of a BNP model with a given distribution on a finite set of functionals. Their approach is based on the conditional distribution of a standard BNP prior, given the functionals of interest.…”
Section: Discussion (mentioning; confidence: 99%)
“…In such cases, it is of interest to use previous time points or previous samples results as prior information, in order to leverage all the information available to estimate the mixture. However, specifying prior information to Dirichlet process mixture models is not straightforward (Kessler et al, 2015). Here we propose to use the posterior MCMC draws obtained from previous dataset y (i) as prior information to analyze the next dataset y (i+1) .…”
Section: Sequential Posterior Approximation (mentioning; confidence: 99%)
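The excerpt above proposes re-using posterior MCMC draws from one dataset as prior information for the next. The citing work does this for Dirichlet process mixtures; the sketch below only illustrates the general idea in a deliberately simplified conjugate setting, and every model choice, name, and number in it is an illustrative assumption rather than anything taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration (not the citing paper's DP-mixture algorithm): a normal mean
# with known observation variance, updated batch by batch. Posterior draws from
# batch i are moment-matched into a normal prior for batch i+1, mimicking
# "use previous posterior draws as prior information".
SIGMA = 1.0                        # known observation standard deviation (assumption)
prior_mean, prior_sd = 0.0, 10.0   # vague initial prior (assumption)

batches = [rng.normal(2.0, SIGMA, size=50) for _ in range(3)]  # simulated y^(i)

for i, y in enumerate(batches, start=1):
    # Conjugate normal-normal update of the mean, given the current prior.
    post_var = 1.0 / (1.0 / prior_sd**2 + len(y) / SIGMA**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + y.sum() / SIGMA**2)

    # Posterior MCMC would produce draws; here the exact posterior is sampled.
    draws = rng.normal(post_mean, np.sqrt(post_var), size=2000)

    # Summarise the draws into the prior for the next batch (moment matching).
    prior_mean, prior_sd = draws.mean(), draws.std(ddof=1)
    print(f"after batch {i}: prior for next batch ~ N({prior_mean:.3f}, {prior_sd:.3f}^2)")
```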
“…Thus, a natural future avenue of research is to combine nonparametric or highly flexible models of the distribution of Y with sequentially additive nonignorable missingness mechanisms. For example, we could incorporate information on summaries of distributions into nonparametric Bayesian models using the approach of Kessler et al (2015).…”
Section: Logit f(M… (mentioning; confidence: 99%)