2007
DOI: 10.1214/07-ejs098
Bayesian inference with rescaled Gaussian process priors

Abstract: We use rescaled Gaussian processes as prior models for functional parameters in nonparametric statistical models. We show how the rate of contraction of the posterior distributions depends on the scaling factor. In particular, we exhibit rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates. To derive our results we establish bounds on small deviation probabilities for smooth stationary Gaussian processes. Comment: Published in at http://dx.doi…
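To make the abstract's idea concrete, here is a minimal sketch, not taken from the paper, of nonparametric regression with a rescaled Gaussian process prior: the prior path is a smooth stationary process run on a stretched time axis, and the rescaling factor c controls how rough prior draws are. The squared-exponential kernel, the noise model, and all names below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def rescaled_se_kernel(s, t, c):
    """Covariance of the rescaled process t -> W(c * t) for a squared-exponential W."""
    return np.exp(-0.5 * (c * (s[:, None] - t[None, :])) ** 2)

def gp_posterior_mean(x_obs, y_obs, x_new, c, sigma=0.1):
    """Posterior mean for Gaussian-noise regression under the rescaled GP prior."""
    K = rescaled_se_kernel(x_obs, x_obs, c) + sigma**2 * np.eye(len(x_obs))
    K_star = rescaled_se_kernel(x_new, x_obs, c)
    return K_star @ np.linalg.solve(K, y_obs)

# Larger c gives rougher prior draws; the paper studies how the posterior
# contraction rate depends on this (possibly sample-size-dependent) scaling.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 50))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
x_grid = np.linspace(0.0, 1.0, 200)
fit = gp_posterior_mean(x, y, x_grid, c=5.0)
```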

Cited by 58 publications (72 citation statements). References 27 publications.
“…() among others. Posterior asymptotic properties, which are primarily driven by the structure of their reproducing kernel Hilbert space, were studied by Tokdar & Ghosh (), Ghosal & Roy (), Choi & Schervish (), van der Vaart & van Zanten (), Castillo () & Castillo (), Castillo et al. () and Bhattacharya et al.…”
Section: Introduction (mentioning)
confidence: 99%
“…Thus, a random series prior may be regarded as a flexible alternative to a Gaussian process prior. It is interesting to note that the theory of posterior contraction for Gaussian process priors established in van der Vaart & van Zanten (), van der Vaart & van Zanten () and van der Vaart & van Zanten () uses deep properties of Gaussian processes, while relatively elementary techniques lead to comparable posterior contraction rates for finite random series priors. Posterior computation for Gaussian process priors often needs reversible jump MCMC procedures (Tokdar, ), typically with a large number of knots to approximate a given Gaussian process.…”
Section: Introduction (mentioning)
confidence: 99%
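For intuition about the finite random series priors contrasted with Gaussian process priors in the excerpt above, here is a minimal sketch of drawing one sample path from such a prior, under purely illustrative assumptions (cosine basis, Gaussian coefficients, random truncation level); it is not the construction of any particular cited paper.

```python
import numpy as np

def draw_random_series(x, rng, max_terms=20):
    """Draw f(x) = sum_{j < J} theta_j * cos(pi * j * x) with a random truncation J."""
    J = rng.integers(1, max_terms + 1)                     # random number of terms
    theta = rng.standard_normal(J) / (1.0 + np.arange(J))  # decaying random coefficients
    basis = np.cos(np.pi * np.arange(J)[:, None] * x[None, :])
    return theta @ basis

rng = np.random.default_rng(1)
x_grid = np.linspace(0.0, 1.0, 200)
sample_path = draw_random_series(x_grid, rng)  # one draw from the series prior
```

Because the unknown function is represented by a finite coefficient vector, inference reduces to finite-dimensional calculations, which is the flexibility the excerpt alludes to.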
“…We included the SE estimator as it is commonly used, and, if the scaling parameter λ is suitably chosen, it has an optimal asymptotic convergence rate for all functions in F (A. van der Vaart & van Zanten, 2007). As mentioned above, in the present case, the regularizer of f is its posterior mean under a Brownian motion prior.…”
Section: Simulation Study (mentioning)
confidence: 99%
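The excerpt above mentions a regularizer equal to the posterior mean under a Brownian motion prior. Here is a minimal sketch of that quantity, assuming the same Gaussian-noise regression setup as the earlier sketch and using the Brownian covariance K(s, t) = min(s, t); names and defaults are illustrative.

```python
import numpy as np

def brownian_kernel(s, t):
    """Covariance of standard Brownian motion: K(s, t) = min(s, t)."""
    return np.minimum(s[:, None], t[None, :])

def bm_posterior_mean(x_obs, y_obs, x_new, sigma=0.1):
    """Posterior mean of f under a Brownian motion prior with Gaussian noise."""
    K = brownian_kernel(x_obs, x_obs) + sigma**2 * np.eye(len(x_obs))
    return brownian_kernel(x_new, x_obs) @ np.linalg.solve(K, y_obs)
```

The SE estimator in the excerpt can be sketched the same way, with a squared-exponential kernel exp(-0.5 * ((s - t) / λ)**2) substituted for min(s, t) and λ playing the role of the scaling parameter whose choice governs the convergence rate.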
“…One can taste a variety of recent applications from the books and surveys such as Mandjes [135] and Willinger et al. [183] (models of communication networks), Rasmussen and Williams [150] (Machine Learning), and van der Vaart and van Zanten [177] (prior models in Bayesian Statistics).…”
Section: Invitation To Further Reading (mentioning)
confidence: 99%