2013
DOI: 10.1214/13-sts418

Variational Inference for Generalized Linear Mixed Models Using Partially Noncentered Parametrizations

Abstract: The effects of different parametrizations on the convergence of Bayesian computational algorithms for hierarchical models are well explored. Techniques such as centering, noncentering and partial noncentering can be used to accelerate convergence in MCMC and EM algorithms but are still not well studied for variational Bayes (VB) methods. As a fast deterministic approach to posterior approximation, VB is attracting increasing interest due to its suitability for large high-dimensional data. Use of different para…
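
As context for the abstract (using illustrative notation that is not taken from the paper itself), centering, noncentering and partial noncentering can be sketched on a simple normal hierarchical model; the per-group weights w_i below are an assumed device showing how partial noncentering interpolates between the two extremes.

```latex
% Illustrative sketch on a simple normal hierarchical model
% (notation assumed here, not the paper's own).
\begin{align*}
\text{Centered:} \quad
  & y_{ij} \mid \eta_i \sim N(\eta_i, \sigma^2), \qquad
    \eta_i \sim N(\mu, \tau^2), \\
\text{Noncentered:} \quad
  & y_{ij} \mid b_i \sim N(\mu + b_i, \sigma^2), \qquad
    b_i \sim N(0, \tau^2), \\
\text{Partially noncentered:} \quad
  & \eta_i = w_i \mu + \tilde b_i, \qquad
    \tilde b_i \sim N\!\big((1 - w_i)\mu, \tau^2\big), \qquad
    w_i \in [0, 1].
\end{align*}
% w_i = 0 recovers the centered form and w_i = 1 the noncentered form;
% intermediate weights can be tuned separately for each group.
```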

Cited by 35 publications (58 citation statements) | References 45 publications
“…The MFVB methods we considered here are fast and versatile and can be easily extended to more complicated scenarios. For example, the methods allow arbitrary priors for the hyperparameters [57] and similar types of model with Gaussian responses [18,20,21]. Stewart [22] provides great examples of more elaborate models within the context of social sciences.…”
Section: Discussion
confidence: 99%
“…Armagan and Dunson [19] developed a fast remedy for sparse covariance estimation relying on a decomposition, but it is restricted to linear responses. Tan and Nott [20] extended their approach and introduced a partially noncentered parametrization strategy for generalized linear mixed models, allowing the random effects for each group to be independent. This restriction was shown to improve the efficiency of the variational algorithms.…”
Section: Introduction
confidence: 99%
“…GLMMs are often considered difficult to estimate because of the presence of random effects and the lack of conjugate priors. VB schemes for GLMMs were considered previously by Rijmen and Vomlel (2008), Ormerod and Wand (2012) and Tan and Nott (2013b), and were shown to have attractive computational and accuracy trade-offs.…”
Section: Accepted Manuscript
confidence: 99%
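
To make the non-conjugacy point concrete, here is a hedged sketch of a generic GLMM marginal likelihood; the symbols (x_{ij}, z_{ij}, b_i, D) are generic illustration, not the notation of the cited papers.

```latex
% Why GLMMs lack conjugacy: the marginal likelihood integrates the random
% effects b_i out of an exponential-family (non-Gaussian) response model.
\begin{equation*}
p(y \mid \beta, D)
  = \prod_{i} \int \Big[ \prod_{j}
      p\big(y_{ij} \mid x_{ij}^{\top}\beta + z_{ij}^{\top} b_i\big) \Big]
    \, N(b_i; 0, D) \, db_i .
\end{equation*}
% For non-Gaussian responses (e.g. Bernoulli or Poisson) this integral has no
% closed form, so neither Gibbs sampling nor mean-field VB has exact conjugate
% updates, which motivates the bounds and approximations cited above.
```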
“…Examples of such models include logistic regression (Jaakkola and Jordan, 1997), nonparametric regression with measurement error (Pham et al., 2013), generalized linear mixed models (Tan and Nott, 2013), and correlated topic models (Blei and Lafferty, 2007). Variational approximations of non-conjugate models often have to be derived on a case-by-case basis and can be difficult to handle.…”
Section: Introduction
confidence: 99%
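
As a hedged illustration of one cited building block: the Jaakkola and Jordan (1997) quadratic bound gives closed-form coordinate updates for a Gaussian variational approximation to Bayesian logistic regression. The NumPy sketch below assumes a N(0, prior_var * I) prior and 0/1 responses; it illustrates only that bound and is not the partially noncentered GLMM algorithm of Tan and Nott.

```python
import numpy as np

def jj_lambda(xi):
    """Jaakkola-Jordan lambda(xi) = tanh(xi/2) / (4*xi), with the xi -> 0 limit 1/8."""
    out = np.full_like(xi, 0.125)
    nz = xi != 0
    out[nz] = np.tanh(xi[nz] / 2.0) / (4.0 * xi[nz])
    return out

def vb_logistic(X, y, prior_var=1.0, n_iter=50):
    """Gaussian variational approximation q(beta) = N(m, S) for Bayesian logistic
    regression with prior beta ~ N(0, prior_var * I), via the Jaakkola-Jordan bound.
    Responses y must be coded 0/1."""
    n, p = X.shape
    S0_inv = np.eye(p) / prior_var
    xi = np.ones(n)                      # local variational parameters, one per observation
    for _ in range(n_iter):
        lam = jj_lambda(xi)
        # q(beta) update: precision and mean are available in closed form given xi
        S_inv = S0_inv + 2.0 * (X.T * lam) @ X
        S = np.linalg.inv(S_inv)
        m = S @ (X.T @ (y - 0.5))
        # xi update: xi_i^2 = x_i^T E[beta beta^T] x_i under the current q(beta)
        E_bb = S + np.outer(m, m)
        xi = np.sqrt(np.einsum("ij,jk,ik->i", X, E_bb, X))
    return m, S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    beta_true = np.array([1.0, -2.0, 0.5])
    y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
    m, S = vb_logistic(X, y)
    print("posterior mean:", np.round(m, 2))
```

Each update (of xi and of (m, S)) maximizes the quadratic lower bound given the other, so the iteration is a coordinate ascent that increases the bound monotonically.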