2018
DOI: 10.1080/10618600.2017.1390472
Gaussian Variational Approximation With a Factor Covariance Structure

Abstract: Variational approximation methods have proven to be useful for scaling Bayesian computations to large data sets and highly parametrized models. Applying variational methods involves solving an optimization problem, and recent research in this area has focused on stochastic gradient ascent methods as a general approach to implementation. Here variational approximation is considered for a posterior distribution in high dimensions using a Gaussian approximating family. Gaussian variational approximation with an un…

Cited by 57 publications (71 citation statements)
References 29 publications
“…This means that for the Leukaemia and Breast datasets the dimension of θ is 14,242, so these are examples with a high-dimensional parameter. These data were also considered in Ong et al (2017a), where slow convergence in the variational optimization was observed using their method; we show here that a natural gradient approach offers a significant improvement. We follow Ong et al (2017a) and run VAFC with f = 4 factors, using only a single sample to estimate the gradient of the lower bound (S = 1) in both methods.…”
Section: Cancer Data: High-dimensional Logistic Regression Using the …
confidence: 90%
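The single-sample (S = 1) estimator mentioned in the excerpt can be illustrated with a minimal sketch: one re-parameterized draw from the factor-covariance Gaussian yields one Monte Carlo estimate of the lower bound. The dimensions and the toy log-joint below are hypothetical illustrations, not the authors' VAFC implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
p, K = 5, 2                               # hypothetical parameter dimension and number of factors
mu = np.zeros(p)                          # variational mean
Psi = 0.3 * rng.standard_normal((p, K))   # factor loadings
d = np.full(p, 0.5)                       # idiosyncratic standard deviations

def log_joint(theta):
    # hypothetical toy log p(y, theta): a standard normal target
    return -0.5 * theta @ theta - 0.5 * p * np.log(2 * np.pi)

# one re-parameterized draw (S = 1): theta = mu + Psi z + d * eps
z = rng.standard_normal(K)
eps = rng.standard_normal(p)
theta = mu + Psi @ z + d * eps

# log q_lambda(theta) under the factor covariance Upsilon = Psi Psi' + diag(d)^2
Upsilon = Psi @ Psi.T + np.diag(d**2)
L = np.linalg.cholesky(Upsilon)
r = np.linalg.solve(L, theta - mu)
log_q = -0.5 * p * np.log(2 * np.pi) - np.sum(np.log(np.diag(L))) - 0.5 * r @ r

# single-sample Monte Carlo estimate of the lower bound
elbo_hat = log_joint(theta) - log_q
```

Differentiating this estimate through the draw (the re-parameterization trick) gives the S = 1 gradient estimate used in the experiments; averaging over more draws reduces its variance at extra cost.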
“…Here the variational optimization is challenging because of the strong dependence between local variance parameters and the corresponding coefficients. Using three real datasets we show that the natural gradient estimation method improves the performance of the approach described in Ong et al (2017a). Let y_i ∈ {0, 1} be a binary response with corresponding covariates x_i = (x_{i1}, ..., x_{ip})', i = 1, ..., n. We consider the logistic regression model…”
Section: Cancer Data: High-dimensional Logistic Regression Using the …
confidence: 97%
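The logistic regression model in the excerpt has log-likelihood Σ_i [y_i x_i'β − log(1 + exp(x_i'β))]. A numerically stable sketch, on hypothetical toy data:

```python
import numpy as np

def log_lik(beta, X, y):
    """Log-likelihood of logistic regression with y_i in {0, 1}.

    Uses logaddexp(0, eta) = log(1 + exp(eta)) for numerical stability.
    """
    eta = X @ beta
    return float(np.sum(y * eta - np.logaddexp(0.0, eta)))

# tiny hypothetical example (intercept plus one covariate)
X = np.array([[1.0, 0.5], [1.0, -1.0], [1.0, 2.0]])
y = np.array([1.0, 0.0, 1.0])
beta = np.array([0.1, 0.2])
ll = log_lik(beta, X, y)
```

For the cancer datasets in the excerpt, p is in the tens of thousands, which is why a parsimonious variational covariance and cheap gradient estimates matter.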
“…Successful application of variational methods requires the VA to be computationally and analytically tractable, and an appropriate transformation needs to exist for the re-parameterization trick to be used. Following Ong et al (2018), the Gaussian approximation q_λ(ϑ) = φ(ϑ; µ, Υ) with a parsimonious factor covariance structure meets both conditions. Here, Υ = ΨΨ' + ∆², where Ψ is a full-rank p_ϑ × K matrix with K ≪ p_ϑ, d = (d_1, .…”
Section: Approximate Estimation
confidence: 94%
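The factor structure Υ = ΨΨ' + ∆² admits the re-parameterization θ = µ + Ψz + d∘ε with z ~ N(0, I_K) and ε ~ N(0, I_{p_ϑ}), which is what makes the trick applicable here. A sketch checking that these draws have the implied covariance (toy dimensions and values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
p, K, n = 3, 1, 200_000                  # toy dimensions: 3 parameters, 1 factor
mu = np.array([1.0, -0.5, 0.0])          # variational mean
Psi = np.array([[0.8], [0.4], [-0.6]])   # factor loadings (p x K)
d = np.array([0.5, 0.3, 0.7])            # idiosyncratic standard deviations

# re-parameterized draws: theta = mu + Psi z + d * eps
z = rng.standard_normal((n, K))
eps = rng.standard_normal((n, p))
theta = mu + z @ Psi.T + d * eps

# implied covariance Upsilon = Psi Psi' + Delta^2 versus the empirical one
Upsilon = Psi @ Psi.T + np.diag(d**2)
emp_cov = np.cov(theta, rowvar=False)
```

Storing only Ψ and d costs O(p_ϑ K) parameters instead of O(p_ϑ²) for a full covariance, which is the point of the parsimonious structure when p_ϑ is large.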
“…For uniqueness, it is common to also assume Ψ_{i,i} = 1, although we do not, because the lack of uniqueness does not hinder the optimization, and the unconstrained parametrization is more convenient. To apply the re-parameterization trick, Ong et al (2018) show that:…”
Section: Approximate Estimation
confidence: 99%
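The non-uniqueness the excerpt refers to is easy to see directly: post-multiplying Ψ by any orthogonal K × K matrix Q leaves Υ = ΨΨ' + ∆² unchanged, so many loadings matrices parameterize the same approximation. A small check with toy values:

```python
import numpy as np

rng = np.random.default_rng(2)
p, K = 4, 2
Psi = rng.standard_normal((p, K))        # toy loadings
d = rng.uniform(0.1, 1.0, size=p)        # toy idiosyncratic std deviations

# an orthogonal 2x2 rotation Q (Q Q' = I)
t = 0.7
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

Upsilon_orig = Psi @ Psi.T + np.diag(d**2)
Upsilon_rot = (Psi @ Q) @ (Psi @ Q).T + np.diag(d**2)
```

Since (ΨQ)(ΨQ)' = ΨQQ'Ψ' = ΨΨ', the rotated loadings give an identical covariance, which is why the unconstrained parametrization costs nothing in the optimization.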