2021
DOI: 10.1111/rssb.12466

Approximate Laplace Approximations for Scalable Model Selection

Abstract: This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.

Cited by 14 publications (22 citation statements) | References 29 publications
“…The posterior probability of each model s given by BMS is obtained from Bayes' rule as $p(s \mid y) = \frac{p(y \mid s)\, p(s)}{p(y)} = \frac{p(y \mid s)\, p(s)}{\sum_{s=1}^{S} p(y \mid s)\, p(s)}$, where p(s) is a user-specified prior model probability and p(y | s) is the so-called integrated likelihood (or marginal likelihood, or evidence) for model s. It can be computed using standard methods, either via closed-form expressions (when available), Laplace approximations (Kass et al., 1990), approximate Laplace approximations (Rossell et al., 2021) when n or p are large, or Markov Chain Monte Carlo (MCMC) methods (Friel & Wyse, 2012).…”
Section: Methods
confidence: 99%
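
To make the quoted formula concrete, here is a minimal Python sketch of how posterior model probabilities could be computed from log integrated likelihoods and prior model probabilities; the function name and the numerical values are illustrative and not taken from the cited papers. Working on the log scale with a log-sum-exp normalisation avoids underflow when the marginal likelihoods are tiny.

```python
import numpy as np

def model_posteriors(log_marginals, prior_probs):
    """p(s | y) from log p(y | s) and p(s), normalised over all candidate models."""
    log_unnorm = np.asarray(log_marginals, dtype=float) + np.log(prior_probs)
    # log-sum-exp trick: subtract the maximum before exponentiating
    # so the largest term becomes exp(0) = 1 and nothing underflows
    log_unnorm -= log_unnorm.max()
    w = np.exp(log_unnorm)
    return w / w.sum()

# Illustrative values: three candidate models under a uniform model prior
print(model_posteriors([-120.3, -118.7, -125.1], [1/3, 1/3, 1/3]))
```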
“…, where p(s) is a user-specified prior model probability and p(y | s) is the so-called integrated likelihood (or marginal likelihood, or evidence) for model s. It can be computed using standard methods, either via closed-form expressions (when available), Laplace approximations (Kass et al, 1990), approximate Laplace approximations (Rossell et al, 2021) when n or p are large, or Markov Chain Monte Carlo (MCMC) methods (Friel & Wyse, 2012).…”
Section: Bayesian Model Selection and Averaging
confidence: 99%
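
Since this excerpt comes from a section on Bayesian model selection and averaging, a brief hypothetical sketch of the averaging step follows: per-model predictions are combined with weights given by the posterior model probabilities p(s | y). All arrays and values below are illustrative only.

```python
import numpy as np

# Bayesian model averaging for one test case: the BMA prediction is the
# posterior-weighted average sum_s p(s | y) * E[y_new | s, y].
posterior_probs   = np.array([0.25, 0.60, 0.15])  # e.g. output of model_posteriors()
model_predictions = np.array([2.1, 2.4, 1.9])     # hypothetical per-model predictions

bma_prediction = float(posterior_probs @ model_predictions)
print(bma_prediction)
```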
“…In this work, we consider the alternative formula described in Section S.1 of the supplementary material of [16], as it offers better computational stability when inverting the Hessian under the independent prior in (7).…”
Section: Approximate Laplace Approximation
confidence: 99%
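
For context, the classical Laplace approximation referenced here replaces the integral $p(y \mid s) = \int p(y \mid \theta, s)\, p(\theta \mid s)\, d\theta$ with a Gaussian approximation around the posterior mode. The sketch below is a generic illustration of that idea, using SciPy's BFGS inverse-Hessian estimate for brevity; it is not the approximate Laplace approximation of Rossell et al. (2021), nor the stabilised formula from supplement S.1 of [16].

```python
import numpy as np
from scipy.optimize import minimize

def laplace_log_marginal(neg_log_joint, theta0):
    """Generic Laplace approximation to log p(y | s).

    neg_log_joint(theta) must return -[log p(y | theta, s) + log p(theta | s)];
    theta0 is a starting point for the mode search. The BFGS inverse-Hessian
    estimate stands in for the exact Hessian, which keeps the sketch short
    but is cruder than what the cited papers use.
    """
    fit = minimize(neg_log_joint, theta0, method="BFGS")
    d = fit.x.size                          # dimension of theta
    hessian = np.linalg.inv(fit.hess_inv)   # ~ negative Hessian of the log joint at the mode
    _, logdet = np.linalg.slogdet(hessian)
    return -fit.fun + 0.5 * d * np.log(2.0 * np.pi) - 0.5 * logdet
```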