2019
DOI: 10.1109/tpami.2018.2889774
Advances in Variational Inference

Abstract: Many modern unsupervised or semi-supervised machine learning algorithms rely on Bayesian probabilistic models. These models are usually intractable and thus require approximate inference. Variational inference (VI) lets us approximate a high-dimensional Bayesian posterior with a simpler variational distribution by solving an optimization problem. This approach has been successfully applied to various models and large-scale applications. In this review, we give an overview of recent trends in variational inference…
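The abstract frames VI as posterior approximation by optimisation; the standard objective in this literature is the evidence lower bound (ELBO), maximised over the variational distribution q. A worked statement of that objective (standard notation, not quoted from the review):

```latex
% VI picks q(z) within a tractable family to maximise the ELBO, which is
% equivalent to minimising KL(q(z) || p(z|x)):
\log p(x) \;\ge\; \mathcal{L}(q)
  = \mathbb{E}_{q(z)}\!\left[\log p(x, z) - \log q(z)\right]
  = \log p(x) - \mathrm{KL}\!\left(q(z)\,\|\,p(z \mid x)\right).
```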

Cited by 511 publications (377 citation statements) · References 93 publications
“…Graves [21] proposed a stochastic method for variational inference with a diagonal Gaussian posterior that can be applied to almost any differentiable log-loss parametric model, including neural networks. However, there is always a trade-off between complexity of the posterior and scalability and robustness [81]. In this work, we adopt the mean-field variational inference [39].…”
Section: Bayesian Neural Network
confidence: 99%
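As an illustration of the approach this excerpt references (a minimal sketch, not code from the cited works), mean-field VI with a diagonal Gaussian posterior over parameters can be trained by stochastic gradients on the negative ELBO via the reparameterisation trick. The toy regression model, prior scale and optimiser settings below are assumptions made for the example:

```python
# Minimal sketch: mean-field VI for a Bayesian linear model with a diagonal
# Gaussian posterior over (weight, bias), trained by maximising the ELBO.
import torch

torch.manual_seed(0)

# Toy regression data: y = 2x + noise
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 2.0 * x + 0.1 * torch.randn_like(x)

# Variational parameters of q(w) = N(mu, diag(softplus(rho)^2))
mu = torch.zeros(2, requires_grad=True)
rho = torch.full((2,), -3.0, requires_grad=True)  # softplus keeps the std positive

opt = torch.optim.Adam([mu, rho], lr=0.05)
prior_std = 1.0  # assumed N(0, 1) prior over parameters

for step in range(2000):
    opt.zero_grad()
    std = torch.nn.functional.softplus(rho)
    # Reparameterised sample of (weight, bias) from the diagonal Gaussian posterior
    eps = torch.randn(2)
    w = mu + std * eps
    pred = w[0] * x + w[1]
    # Negative log-likelihood under a Gaussian observation model (sigma = 0.1)
    nll = 0.5 * ((y - pred) ** 2 / 0.1 ** 2).sum()
    # Closed-form KL between the diagonal Gaussian q(w) and the N(0, 1) prior
    kl = (torch.log(prior_std / std) + (std ** 2 + mu ** 2) / (2 * prior_std ** 2) - 0.5).sum()
    loss = nll + kl  # negative ELBO
    loss.backward()
    opt.step()

print("posterior mean:", mu.detach(), "posterior std:", torch.nn.functional.softplus(rho).detach())
```

The diagonal (mean-field) posterior is what keeps this scalable; richer posteriors trade that scalability for flexibility, which is the trade-off the excerpt points to.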
“…One can then evaluate the posterior probability of discrete lexical, prosody and speaker states, using the respective likelihood of the (Kim, Frisina et al) parameter estimates (and any priors over discrete states should they be available). This MAP scheme can be read in the spirit of predictive coding that has been amortised (Zhang, Butepage et al 2018).…”
Section: Model Inversion or Word Recognition
confidence: 99%
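A hypothetical sketch of the discrete posterior evaluation this passage describes (the state set, likelihood values and priors are invented for illustration):

```python
# Posterior over discrete states from Bayes' rule; MAP picks the argmax.
import numpy as np

likelihood = np.array([0.02, 0.10, 0.01])  # p(parameter estimates | state k)
prior = np.array([0.5, 0.3, 0.2])          # p(state k); uniform if unavailable

posterior = likelihood * prior
posterior /= posterior.sum()               # p(state k | parameter estimates)
map_state = int(np.argmax(posterior))

print(posterior, "MAP state:", map_state)
```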
“…The above models were implemented using the Python-based probabilistic programming language PyMC3 v3.8 [36] and inference was conducted using Automatic Differentiation Variational Inference (ADVI; [37]), instead of developing bespoke estimation algorithms, which is a rather laborious process, particularly when multiple candidate models are considered [38][39][40]. Variational inference (VI; [41,42]) is a computationally efficient approach for Bayesian inference, which aims to approximate the posterior density p(z|y) of latent variables z given data y using a surrogate probability density q_θ(z) parametrised by a vector of variational parameters θ. In our case, the data y are the locus- and sample-specific read counts r_ij and R_ij, the local copy numbers D_ij, the sample-specific purities ρ_j and the sample collection times t_j, while the latent variables z are the cancer cell fractions φ_jk, the cluster weights w_k, the amplitudes h_k², the time-scales τ_k and the sample-specific dispersions v_j.…”
Section: Inference
confidence: 99%
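A minimal PyMC3 sketch of the workflow this excerpt describes, fitting a toy model with ADVI rather than the authors' cancer-cell-fraction model; the model, data and iteration count below are placeholders for illustration:

```python
# Fit a simple Gaussian model with ADVI in PyMC3, then sample from the
# fitted variational approximation instead of running MCMC.
import numpy as np
import pymc3 as pm

y = np.random.normal(1.0, 0.5, size=200)  # toy observations

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)

    approx = pm.fit(n=20000, method="advi")  # Gaussian surrogate in transformed space
    trace = approx.sample(1000)              # draws from the variational approximation

print(trace["mu"].mean(), trace["sigma"].mean())
```

The appeal noted in the excerpt is that the same `pm.fit` call works across candidate models without deriving a bespoke estimation algorithm for each one.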