2007
DOI: 10.1016/j.neuroimage.2006.08.035

Variational free energy and the Laplace approximation

Abstract: This note derives the variational free energy under the Laplace approximation, with a focus on accounting for additional model complexity induced by increasing the number of model parameters. This is relevant when using the free energy as an approximation to the log-evidence in Bayesian model averaging and selection. By setting restricted maximum likelihood (ReML) in the larger context of variational learning and expectation maximisation (EM), we show how the ReML objective function can be adjusted to provide …
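The free energy the abstract refers to decomposes into an accuracy term (expected log-likelihood) minus a complexity term (KL divergence between the approximate posterior and the prior), which is what penalises additional parameters. A minimal sketch of this decomposition, using an illustrative linear-Gaussian model (where the Laplace approximation is exact) rather than the paper's ReML scheme; the function name and the choice of model are assumptions for illustration:

```python
import numpy as np

def laplace_free_energy(y, X, sigma2=1.0, alpha=1.0):
    """Laplace free energy for the illustrative model
    y = X @ theta + noise, noise ~ N(0, sigma2 I), prior theta ~ N(0, I/alpha).
    Returns (F, accuracy, complexity) with F = accuracy - complexity."""
    n, d = X.shape
    # Posterior precision = Hessian of the negative log-joint at the mode
    P = X.T @ X / sigma2 + alpha * np.eye(d)
    Sigma = np.linalg.inv(P)                      # posterior covariance
    mu = Sigma @ X.T @ y / sigma2                 # posterior mean (MAP estimate)
    resid = y - X @ mu
    # Accuracy: expected log-likelihood under the Gaussian posterior
    accuracy = (-0.5 * n * np.log(2 * np.pi * sigma2)
                - 0.5 * resid @ resid / sigma2
                - 0.5 * np.trace(X.T @ X @ Sigma) / sigma2)
    # Complexity: KL[ N(mu, Sigma) || N(0, I/alpha) ] -- grows with parameters
    complexity = 0.5 * (alpha * (mu @ mu + np.trace(Sigma))
                        - d - np.linalg.slogdet(alpha * Sigma)[1])
    return accuracy - complexity, accuracy, complexity
```

For this linear-Gaussian case the free energy coincides with the exact log-evidence, which makes the decomposition easy to check; the paper's contribution concerns how the complexity term behaves under the Laplace approximation in general (nonlinear) models.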

Cited by 824 publications (869 citation statements)
References 38 publications
“…In this context, the objective function is the (log) marginal likelihood as approximated by variational free energy. This accounts for the accuracy but also the complexity of the model of temporal correlations (Friston et al, 2007). The free energy provides a lower bound on the model evidence, enabling Bayesian model comparison.…”
Section: Methods
confidence: 99%
“…A standard variational (Laplace) Bayesian scheme was used to approximate the conditional density over parameters by maximizing a negative free energy bound on log-model evidence (Friston et al, 2007). This inversion was used to assess which models (1-10) best explained the theta and gamma changes.…”
Section: Bayesian Model Inversion and Selection
confidence: 99%
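Once each candidate model has been inverted, the free energies approximate the log-evidences, so models are compared by their differences (log Bayes factors) or, under a flat prior over models, by a softmax over free energies. A minimal sketch of that last step; the function name is an assumption for illustration:

```python
import numpy as np

def model_posteriors(free_energies):
    """Posterior model probabilities from per-model free energies,
    assuming a flat prior over models (softmax of F, shifted for stability)."""
    F = np.asarray(free_energies, dtype=float)
    w = np.exp(F - F.max())   # subtract max to avoid overflow
    return w / w.sum()
```

A difference of 3 nats between two models already yields a posterior probability of about 0.95 for the better model, which is why free-energy differences are a practical basis for the model selection described above.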
“…Here, we revisit questions about the generation of distributed responses by analysing the data using conventional deterministic DCMs (EM), stochastic DCMs under the mean-field approximation (DEM) and generalised filtering (GF). The mathematical details of these schemes are described in a series of technical papers (e.g., EM: Friston et al, 2007; DEM: Friston et al, 2008; GF: Friston et al, 2010). In this paper, we focus on the products of these schemes and how they differ from each other.…”
Section: Stochastic DCM
confidence: 99%