2019
DOI: 10.1002/andp.201800233

Variational Inference for Stochastic Differential Equations

Abstract: The statistical inference of the state variable and the drift function of stochastic differential equations (SDE) from sparsely sampled observations is discussed herein. A variational approach is used to approximate the distribution over the unknown path of the SDE conditioned on the observations. This approach also provides approximations for the intractable likelihood of the drift. The method is combined with a nonparametric Bayesian approach which is based on a Gaussian process prior over drift functions.
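To make the approach concrete, here is a minimal sketch (a toy setting, not the paper's implementation) of the variational free energy that arises when the posterior over SDE paths is approximated by a second SDE with a parametric drift: by Girsanov's theorem the KL divergence between the two path measures reduces to a time integral of the squared drift difference, and the Gaussian observation likelihood at the sparse sample times is added to it. The prior drift f_prior, the linear posterior drift g_post, and all data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 0.5            # diffusion coefficient of dX_t = f(X_t) dt + sigma dW_t
r = 0.1                # std of the Gaussian observation noise
dt = 0.01              # Euler-Maruyama step size
T = 2.0
ts = np.arange(0.0, T, dt)

# Sparse synthetic observations (time index -> value); illustrative placeholders.
obs = {int(round(t / dt)): y for t, y in zip([0.5, 1.0, 1.5], [0.8, 0.2, -0.4])}

def f_prior(x):
    """Prior drift (here a double-well force), standing in for the model's drift."""
    return x - x**3

def g_post(x, theta):
    """Parametric approximate-posterior drift; a simple linear form a*x + b."""
    a, b = theta
    return a * x + b

def free_energy(theta, n_paths=500):
    """Monte Carlo estimate of
       E_q[ int (g - f)^2 / (2 sigma^2) dt ] - E_q[ sum_k log N(y_k | X_{t_k}, r^2) ],
       i.e. KL(approximate path measure || prior path measure) minus data fit."""
    x = np.zeros(n_paths)                 # known initial state, fixed at 0
    kl = np.zeros(n_paths)
    loglik = np.zeros(n_paths)
    for i in range(len(ts)):
        if i in obs:
            loglik += -0.5 * ((obs[i] - x) / r) ** 2 - np.log(r * np.sqrt(2 * np.pi))
        g = g_post(x, theta)
        kl += (g - f_prior(x)) ** 2 / (2 * sigma**2) * dt
        # advance the approximating posterior SDE by one Euler-Maruyama step
        x = x + g * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return np.mean(kl - loglik)

# A full implementation would minimise this objective over theta; here we
# simply compare two candidate posterior drifts.
print(free_energy((-1.0, 0.0)))
print(free_energy((-2.0, 0.3)))
```

In a full treatment the free energy would be minimised with automatic differentiation and, following the paper, a Gaussian process prior would be placed on the drift itself so that an approximate posterior over drift functions is also obtained; the sketch keeps the prior drift fixed to stay short.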

Cited by 28 publications (16 citation statements). References 31 publications.
“…While sampling-based methods of inference for SDE models do exist [64,65], these are generally not scalable to large datasets or to models with many parameters. Instead, we use an approximate variational inference approach [66,67]. We assume a parametric form of the posterior that is optimized to be close to the true posterior.…”
Section: Variational Approximation for Scalable Bayesian Inference
Citation type: mentioning, confidence: 99%
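The passage above describes the generic variational recipe: instead of sampling, a parametric family is fitted to the true posterior by minimising a divergence between the two. The following self-contained illustration uses a toy one-dimensional posterior, a Gaussian variational family, and a crude grid search in place of gradient-based optimisation; all names and numbers are assumptions made for the example.

```python
import numpy as np

z = np.linspace(-6.0, 6.0, 4001)
dz = z[1] - z[0]

# Toy unnormalised "true" posterior: non-Gaussian (quartic term, skewed).
log_p_tilde = -0.5 * z**2 - 0.1 * z**4 + z
p = np.exp(log_p_tilde)
p /= p.sum() * dz                      # normalise on the grid

def kl_q_p(m, s):
    """KL(q || p) for the variational family q = N(m, s^2), evaluated on the grid."""
    q = np.exp(-0.5 * ((z - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    mask = q > 1e-12
    return np.sum(q[mask] * (np.log(q[mask]) - np.log(p[mask])) * dz)

# Crude stand-in for gradient-based optimisation: grid search over (m, s).
candidates = ((kl_q_p(m, s), m, s)
              for m in np.linspace(-2.0, 2.0, 81)
              for s in np.linspace(0.2, 2.0, 37))
best_kl, best_m, best_s = min(candidates)
print("closest q: m = %.2f, s = %.2f, KL = %.4f" % (best_m, best_s, best_kl))
```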
“…The functional form of the posterior drift is both more general and more easily trainable than the network SDE in Eq (4), but it is ultimately forced to be close to the network dynamics in Eq (4) by the loss function. The loss function for this approach has been previously derived [66,67]. The imputed baseline states x_0 are averaged over.…”
Section: Variational Approximation for Scalable Bayesian Inference
Citation type: mentioning, confidence: 99%
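Reading the loss referenced here as the Girsanov-based variational free energy derived in [66,67], a hedged sketch of training a flexible posterior drift against a fixed "network" prior drift might look as follows; f_net, g_phi, the observations, and every hyperparameter are illustrative placeholders, not the cited authors' code.

```python
import torch

torch.manual_seed(0)

sigma, r, dt, n_steps, n_paths = 0.5, 0.1, 0.02, 100, 64

# Prior ("network") drift: a fixed small MLP standing in for the mechanistic SDE.
f_net = torch.nn.Sequential(torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
for p in f_net.parameters():
    p.requires_grad_(False)

# Trainable posterior drift g_phi, more flexible than f_net in general.
g_phi = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

# Sparse synthetic observations: time index -> observed value.
obs = {25: 0.6, 50: 0.1, 75: -0.3}

def loss_fn():
    x = torch.zeros(n_paths, 1)        # baseline state, here fixed at zero
    kl = torch.zeros(n_paths, 1)
    nll = torch.zeros(n_paths, 1)
    for i in range(n_steps):
        if i in obs:
            nll = nll + 0.5 * ((obs[i] - x) / r) ** 2   # Gaussian misfit, constants dropped
        g = g_phi(x)
        kl = kl + (g - f_net(x)) ** 2 / (2 * sigma ** 2) * dt
        x = x + g * dt + sigma * (dt ** 0.5) * torch.randn_like(x)
    return (kl + nll).mean()           # KL(posterior path || prior path) + data misfit

opt = torch.optim.Adam(g_phi.parameters(), lr=1e-2)
for step in range(200):
    opt.zero_grad()
    loss = loss_fn()
    loss.backward()
    opt.step()
print(float(loss))
```

The KL term keeps the posterior drift close to the network dynamics while the data term pulls the sample paths toward the observations; averaging over imputed baseline states, as in the quoted text, would replace the fixed zero initialisation of x in this sketch.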
“…In its most general setting of sampling from a target distribution, this formulation was known to Dai Pra (1991). Tzen and Raginsky (2019b) study the theoretical properties of this approach in the context of generative models (Kingma et al., 2021; Goodfellow et al., 2014); finally, Opper (2019) applies this formulation to time series modelling. In contrast, our focus is on the estimation of a Bayesian posterior for a broader class of models than Tzen and Raginsky explore.…”
Section: Stochastic Control Formulation
Citation type: mentioning, confidence: 99%
“…The theory of information has very practical applications in physics and elsewhere. Manfred Opper discusses how the solutions of stochastic differential equations can be inferred via variational methods, which have their origin in thermodynamics. Torsten Enßlin introduces information field theory (IFT), the information theory for fields, as a means to infer the configuration of a physical field from incomplete and noisy data.…”
Section: Information
Citation type: mentioning, confidence: 99%