2016
DOI: 10.1080/10618600.2014.998336

Online Variational Bayes Inference for High-Dimensional Correlated Data

Abstract: High-dimensional data with hundreds of thousands of observations are becoming commonplace in many disciplines. The analysis of such data poses many computational challenges, especially when the observations are correlated over time and/or across space. In this paper we propose flexible hierarchical regression models for analyzing such data that accommodate serial and/or spatial correlation. We address the computational challenges involved in fitting these models by adopting an approximate inference framework. …

Cited by 10 publications (4 citation statements)
References 35 publications

“…In such cases, the computational cost of simulating from p(θ|y), via MCMC for example, may simply be prohibitive, given the need to both explore a high-dimensional and complex parameter space and, at each point in that search, evaluate p(y|θ) at y. In contrast, the variational family Q, and the optimization algorithm, can be chosen in such a way that a VB approximation of p(θ|y) can be produced within an acceptable timeframe, even when the dimension of θ is in the thousands, or the tens of thousands (Braun and McAuliffe, 2010; Kabisa et al., 2016; Wand, 2017; Koop and Korobilis, 2018). The ability of VB to scale to large models and datasets also makes the method particularly suitable for exploring multiple models quickly, perhaps as a preliminary step to a more targeted analysis (Blei et al., 2017).…”
Section: Variational Bayes (VB)
Citation type: mentioning; confidence: 99%
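For readers of this report, the optimization the excerpt above alludes to can be stated compactly. The following is the standard VB formulation (textbook material, not a detail taken from this paper's abstract): the approximation q is chosen from a tractable family Q by minimizing the Kullback-Leibler divergence to the exact posterior, which is equivalent to maximizing the evidence lower bound,

\[
q^{*}(\theta) \;=\; \arg\min_{q \in \mathcal{Q}} \, \mathrm{KL}\!\left( q(\theta) \,\|\, p(\theta \mid y) \right)
\;=\; \arg\max_{q \in \mathcal{Q}} \left\{ \mathbb{E}_{q}\!\left[ \log p(y, \theta) \right] - \mathbb{E}_{q}\!\left[ \log q(\theta) \right] \right\}.
\]

The right-hand form involves only the joint p(y, θ), not the intractable posterior, which is why the per-iteration cost can stay manageable even when θ has thousands of components.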
“…Background material regarding Bayesian analysis of DPM models, including many references and detailed discussions relating to MCMC-based techniques for sampling from the relevant posterior, is given in Müller et al. (2015). Online VB-based inference for DPMs has been established using a Mean Field approach; see, e.g., Hoffman et al. (2010), Wang et al. (2011), and Kabisa et al. (2016). In contrast, our approach incorporates SVB, which allows for greater flexibility regarding the form of the approximating posterior distribution.…”
Section: A Hierarchical Model
Citation type: mentioning; confidence: 99%
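As a point of reference for the "Mean Field approach" mentioned above (standard material, not specific to the DPM construction discussed in the excerpt), the approximating posterior is restricted to a fully factorized form and each factor is updated in turn by coordinate ascent:

\[
q(\theta) \;=\; \prod_{j=1}^{J} q_{j}(\theta_{j}),
\qquad
q_{j}(\theta_{j}) \;\propto\; \exp\!\left\{ \mathbb{E}_{q_{-j}}\!\left[ \log p(y, \theta) \right] \right\},
\]

where the expectation is taken over all factors other than q_j. The SVB approach contrasted with it in the excerpt relaxes this factorization, allowing more flexible forms for the approximating posterior.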
“…this is equivalent to maximizing an evidence lower bound on the log marginal likelihood. Variational Bayes is widely applied due to the availability of analytic coordinate ascent updates for models with conditionally conjugate priors (Braun and McAuliffe, 2010; Durante and Rigon, 2019; Ray and Szabo, 2022) and its scalability to massive data sets via subsampling (Hoffman et al., 2013; Kabisa et al., 2016). Frequentist consistency and asymptotic normality of variational Bayes have been established (Wang and Blei, 2019).…”
Section: Introduction
Citation type: mentioning; confidence: 99%
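The "scalability to massive data sets via subsampling" cited in the last excerpt is usually implemented as stochastic variational inference: at each iteration, the update of the global variational parameters is estimated from a random minibatch and applied with a decaying step size. Below is a minimal, self-contained sketch for a toy conjugate model (a Gaussian mean with known variance); every name, number, and modelling choice here is an illustrative assumption, not something drawn from the paper under review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(mu_true, sigma2) with sigma2 treated as known.
N, sigma2, mu_true = 100_000, 4.0, 2.5
y = rng.normal(mu_true, np.sqrt(sigma2), size=N)

# Conjugate prior mu ~ N(m0, s0sq), written in natural-parameter form (eta1, eta2).
m0, s0sq = 0.0, 10.0
eta_prior = np.array([m0 / s0sq, -0.5 / s0sq])

# Stochastic variational inference: noisy natural-gradient steps on q(mu) = N(m, s^2),
# each computed from a random minibatch (the "subsampling" the excerpt refers to).
eta = eta_prior.copy()
batch_size, n_steps = 100, 500
for t in range(1, n_steps + 1):
    yb = y[rng.choice(N, size=batch_size, replace=False)]
    # "As if the whole data set looked like this minibatch": rescale the
    # minibatch sufficient statistics by N / batch_size and add the prior.
    eta_hat = eta_prior + (N / batch_size) * np.array(
        [yb.sum() / sigma2, -0.5 * batch_size / sigma2]
    )
    rho = (t + 10.0) ** -0.7          # Robbins-Monro step size
    eta = (1.0 - rho) * eta + rho * eta_hat

# Convert natural parameters back to the mean/variance of q(mu).
s_sq = -0.5 / eta[1]
m = eta[0] * s_sq
exact_mean = (m0 / s0sq + y.sum() / sigma2) / (1.0 / s0sq + N / sigma2)
print(f"q(mu) ~= N({m:.3f}, {s_sq:.2e}); exact posterior mean = {exact_mean:.3f}")
```

Because the toy model is conditionally conjugate, the minibatch quantity eta_hat has the same closed form as the full-data coordinate ascent update, with the sufficient statistics rescaled by N / batch_size; the Robbins-Monro step size rho is what lets the noisy updates converge.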