2020
DOI: 10.1016/j.csda.2020.106954
Posterior inference for sparse hierarchical non-stationary models

Abstract: Gaussian processes are valuable tools for non-parametric modelling, where typically an assumption of stationarity is employed. While removing this assumption can improve prediction, fitting such models is challenging. In this work, hierarchical models are constructed based on Gaussian Markov random fields with stochastic spatially varying parameters. Importantly, this allows for non-stationarity while also addressing the computational burden through a sparse banded representation of the precision matrix. In th…
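To make the abstract's computational claim concrete, here is a minimal sketch (not the authors' code) of how a one-dimensional Gaussian Markov random field with a spatially varying parameter yields a sparse, banded precision matrix whose Cholesky factor gives log-determinants and prior draws at O(n) cost. The discretisation, the parameter kappa(s), and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import cholesky_banded, solve_banded

n = 200
s = np.linspace(0.0, 1.0, n)
h = s[1] - s[0]

# Spatially varying parameter kappa(s); larger kappa means a shorter correlation
# range, so the field is non-stationary. The bump shape is an arbitrary choice.
kappa = 1.0 + 4.0 * np.exp(-0.5 * ((s - 0.5) / 0.1) ** 2)

# Tridiagonal precision: kappa^2 times a lumped mass term plus a second-difference
# (stiffness) term. Only two diagonals need to be stored.
main = h * kappa**2 + 2.0 / h
super_diag = np.full(n, -1.0 / h)
super_diag[0] = 0.0                       # unused slot in LAPACK upper-banded storage

Q_banded = np.vstack([super_diag, main])  # shape (2, n): superdiagonal, diagonal
U = cholesky_banded(Q_banded)             # banded Cholesky, Q = U^T U, O(n) cost

# Log-determinant and a prior draw x ~ N(0, Q^{-1}), both from the banded factor.
log_det_Q = 2.0 * np.sum(np.log(U[1]))
rng = np.random.default_rng(0)
x = solve_banded((0, 1), U, rng.standard_normal(n))  # solve U x = z

print(f"log|Q| = {log_det_Q:.2f}")
```

Dense operations on the corresponding n-by-n covariance would cost O(n^3); the banded precision representation is what keeps hierarchical non-stationary models of this kind tractable.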

Cited by 23 publications (26 citation statements) | References: 38 publications
“…A more general approach to GPR is to employ parameterized nonstationary covariance kernels. Nonstationary kernels can be obtained by modifying stationary covariance kernels, e.g., [41, 27, 32, 37, 4, 28], or from neural networks with specific activation functions, e.g., [29, 34], among other approaches. Many of these approaches assume a specific functional form for the correlation function, chosen according to expert knowledge.…”
Section: (mentioning)
confidence: 99%
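As one hedged illustration of "modifying stationary covariance kernels", the sketch below implements the classic Gibbs construction: a squared-exponential kernel whose lengthscale varies with the input. The lengthscale function ell(x) is a hypothetical choice, not taken from any of the cited works.

```python
import numpy as np

def ell(x):
    # Hypothetical input-dependent lengthscale: short on the left, longer on the right.
    return 0.2 + 1.0 / (1.0 + np.exp(-x))

def gibbs_kernel(x1, x2):
    # k(x, x') = sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2)) * exp(-(x - x')^2 / (l(x)^2 + l(x')^2))
    l1, l2 = ell(x1)[:, None], ell(x2)[None, :]
    s = l1**2 + l2**2
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return np.sqrt(2.0 * l1 * l2 / s) * np.exp(-sq_dist / s)

x = np.linspace(-4.0, 10.0, 100)
K = gibbs_kernel(x, x)                      # a valid (positive semi-definite) covariance
print(np.linalg.eigvalsh(K).min() > -1e-8)  # numerically PSD up to round-off
```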
“…These include, for example, deep Gaussian processes (Dunlop et al., 2018; Emzir et al., 2020), level-set methods (Dunlop et al., 2017), mixtures of compound Poisson processes and Gaussians (Hosseini, 2017), and stacked Matérn fields via stochastic partial differential equations (Roininen et al., 2019). The problem with hierarchical priors is that in the posteriors, the parameters and hyperparameters may become strongly coupled, which means that vanilla MCMC methods become problematic and, for example, reparameterizations are needed for sampling the posterior efficiently (Chada et al., 2019; Monterrubio-Gómez et al., 2020). In level-set methods, the number of levels is usually low because experiments have shown that the method deteriorates when the number of levels is increased.…”
Section: Literature Review (mentioning)
confidence: 99%
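The coupling problem described above is often tackled with a non-centred (whitened) parameterisation. The toy sketch below, with an assumed squared-exponential kernel and a made-up hyperparameter value, shows the general idea rather than the cited papers' actual schemes.

```python
import numpy as np

def sq_exp_cov(s, lengthscale, jitter=1e-8):
    d = s[:, None] - s[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2) + jitter * np.eye(len(s))

s = np.linspace(0.0, 1.0, 50)
rng = np.random.default_rng(1)

# Centred parameterisation: the sampler carries x ~ N(0, K(theta)) directly, so x and
# the hyperparameter theta become strongly coupled a posteriori and vanilla MCMC mixes poorly.
theta = 0.3
x_centred = np.linalg.cholesky(sq_exp_cov(s, theta)) @ rng.standard_normal(len(s))

# Non-centred parameterisation: the sampler carries whitened variables v ~ N(0, I) and
# reconstructs the field as x = L(theta) v only where needed, so updating theta no
# longer has to drag the whole latent field along with it.
v = rng.standard_normal(len(s))

def field(theta, v):
    return np.linalg.cholesky(sq_exp_cov(s, theta)) @ v

x_noncentred = field(theta, v)
```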
“…This example is grid-free, at least, and this feature will be desirable for high-dimensional problems of the type considered here. It will be interesting to compare this method to grid-based Bayesian approaches such as [11, 24]. If the regressors are defined in terms of grids, for example θ_l^r are the coefficients of expansion in piecewise linear finite element nodal basis functions [38], then there are similarities between the methods.…”
Section: Mathematical Setup and Algorithm (mentioning)
confidence: 99%
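For readers unfamiliar with the grid-based representation mentioned above, here is a hedged sketch of a piecewise linear finite element nodal ("hat") basis expansion, f(x) ≈ Σ_l θ_l φ_l(x), on the interval [−4, 10] that appears in the experiments below. The grid size and target function are invented for illustration.

```python
import numpy as np

nodes = np.linspace(-4.0, 10.0, 29)              # uniform grid nodes carrying the coefficients

def hat_basis(x, nodes):
    # phi_l(x) equals 1 at node l and decays linearly to 0 at the neighbouring nodes.
    h = nodes[1] - nodes[0]
    return np.clip(1.0 - np.abs(x[:, None] - nodes[None, :]) / h, 0.0, None)

f = lambda t: np.sin(t) + (t > 3.0)              # smooth part plus a jump (illustrative)
theta = f(nodes)                                 # nodal coefficients theta_l = f(node_l)

x = np.linspace(-4.0, 10.0, 400)
f_approx = hat_basis(x, nodes) @ theta           # sum_l theta_l * phi_l(x)

# With nodal coefficients, this expansion is exactly piecewise linear interpolation.
print(np.allclose(f_approx, np.interp(x, nodes, theta)))
```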
“…It is clear that neither of the simple regression methods is able to cleanly recover the discontinuity. Now we consider slightly more complicated functions, following the recent work [24]. First, consider X = [−4, 10] and…”
Section: Numerical Experiments (mentioning)
confidence: 99%