Message passing on a factor graph is a powerful paradigm for the coding of approximate inference algorithms for arbitrarily large graphical models. The notion of a factor graph fragment allows for compartmentalization of algebra and computer code. We show that the Inverse G-Wishart family of distributions enables fundamental variational message passing factor graph fragments to be expressed elegantly and succinctly. Such fragments arise in models for which approximate inference concerning covariance matrix or variance parameters is made, and are ubiquitous in contemporary statistics and machine learning.
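For orientation, one common parameterization of the Inverse G-Wishart density takes the Inverse Wishart form restricted by a graph G (stated here as an assumed illustrative convention; shape parameterizations differ across references):
$$ p(\boldsymbol{X}) \propto |\boldsymbol{X}|^{-(\xi+2)/2} \exp\!\left\{ -\tfrac{1}{2}\,\mathrm{tr}\!\left( \boldsymbol{\Lambda}\,\boldsymbol{X}^{-1} \right) \right\}, $$
where $\xi$ is a shape parameter, $\boldsymbol{\Lambda}$ is a symmetric positive definite scale matrix, and the support is the set of symmetric positive definite matrices $\boldsymbol{X}$ such that $\boldsymbol{X}^{-1}$ respects the zero pattern encoded by the graph $G$.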
We extend recent work concerning variational approximations via message passing to accommodate approximate fitting and inference for skew t regression models. Derivation of the variational message passing updates is challenging owing to the presence of non-standard exponential families and the need for numerical integration. Nevertheless, the factor graph fragment approach means that algorithm updates only need to be derived once for a particular response model, which can then be incorporated into arbitrarily complex models. Another advantage of our work is that all skew t parameters are inferred rather than held fixed. Furthermore, we show that posterior dependence arising in an auxiliary variable representation of a skew t model may lead to poor variational message passing approximation accuracy when simple auxiliary variable representations of the likelihood fragment and convenient factorizations of the approximating densities are used.
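As a point of reference, a standard stochastic (auxiliary variable) representation of a skew t response, given here as a generic sketch rather than the exact likelihood fragment used in the paper, is
$$ y = \mu + \sigma\,\frac{\delta\,|u_0| + \sqrt{1-\delta^2}\,u_1}{\sqrt{w}}, \qquad u_0, u_1 \overset{\mathrm{iid}}{\sim} N(0,1), \qquad w \sim \mathrm{Gamma}\!\left(\tfrac{\nu}{2}, \tfrac{\nu}{2}\right), $$
where $\mu$, $\sigma$, $\delta$ and $\nu$ are the location, scale, skewness and degrees-of-freedom parameters. Posterior dependence among auxiliary variables of this kind is the type of structure that convenient product-form approximating densities fail to capture.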
Structural equation models (SEMs) are commonly used to study the structural relationships between observed variables and latent constructs. Recently, Bayesian fitting procedures for SEMs have received more attention thanks to their potential to facilitate the adoption of more flexible model structures, and variational approximations have been shown to provide fast and accurate inference for Bayesian analysis of SEMs. However, the application of variational approximations is currently limited to very simple, elemental SEMs. We develop mean-field variational Bayes (MFVB) algorithms for two SEM formulations for data that present non-Gaussian features such as skewness and multimodality. The proposed models exploit mixtures of Gaussians, include covariates for the analysis of latent traits, and accommodate missing data. We also examine two variational information criteria for model selection that are straightforward to compute in our variational inference framework. The performance of the MFVB algorithms and information criteria is investigated through a simulated data study and a real data application.
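For background on the mean-field machinery referred to above (a generic sketch, not the specific factorization adopted for these SEMs), MFVB approximates the joint posterior by a product of factors and updates each factor via the standard optimal-density result
$$ q(\boldsymbol{\theta}) = \prod_{i=1}^{M} q_i(\boldsymbol{\theta}_i), \qquad q_i^{*}(\boldsymbol{\theta}_i) \propto \exp\!\left\{ E_{-\boldsymbol{\theta}_i}\!\left[ \log p(\boldsymbol{y}, \boldsymbol{\theta}) \right] \right\}, $$
where $E_{-\boldsymbol{\theta}_i}$ denotes expectation with respect to all factors other than $q_i$; cycling through these updates performs coordinate ascent on the evidence lower bound.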