2001
DOI: 10.1162/08997660151134299
Bayesian Analysis of Mixtures of Factor Analyzers

Abstract: For Bayesian inference on the mixture of factor analyzers (MFA), natural conjugate priors on the parameters are introduced, and a Gibbs sampler that generates parameter samples from the posterior is constructed. In addition, a deterministic estimation algorithm is derived by taking modes instead of samples from the conditional posteriors used in the Gibbs sampler. This can be regarded as a maximum a posteriori (MAP) estimation algorithm with hyperparameter search. The behaviors of the Gibbs sampler and th…
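The abstract's key contrast — a Gibbs sampler that draws from conditional posteriors versus a deterministic MAP-style algorithm that takes their modes — can be illustrated on a toy conjugate model. This is a minimal sketch, not the paper's full MFA sampler: it uses a single Gaussian mean with a conjugate normal prior, where the conditional posterior is available in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_params(x, prior_mean=0.0, prior_var=10.0, noise_var=1.0):
    """Closed-form normal posterior for a Gaussian mean under a conjugate
    normal prior (known noise variance). Returns (mean, variance)."""
    n = len(x)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + x.sum() / noise_var)
    return post_mean, post_var

# Synthetic data with true mean 3.0.
x = rng.normal(loc=3.0, scale=1.0, size=200)
m, v = posterior_params(x)

gibbs_draw = rng.normal(m, np.sqrt(v))  # stochastic update: sample the conditional posterior
map_update = m                          # deterministic update: take its mode instead
```

In the paper's MFA setting the same substitution is applied to every conditional posterior in the sampler, yielding the MAP estimation algorithm with hyperparameter search.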

Cited by 33 publications (28 citation statements)
References 7 publications
“…An alternative way of proceeding is to adopt some prior distribution for the D_i, as in the Bayesian approaches of Fokoué and Titterington (2000), Ghahramani and Beal (2000), and Utsugi and Kumagai (2001). The mixture of probabilistic component analyzers (PCAs) model, as proposed by Tipping and Bishop (1997), has form (6) with each D_i now having the isotropic structure σ_i² I, for which the estimates are given explicitly by an eigenvalue decomposition of the current value of V_i.…”
Section: Maximum Likelihood Estimation Of Mixture Of Factor Analyzers
confidence: 99%
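The closed-form eigendecomposition solution mentioned in this quote can be sketched for a single isotropic (probabilistic PCA) component, following Tipping and Bishop: the noise variance is the average of the discarded eigenvalues of the sample covariance, and the loadings come from the leading eigenvectors. A hedged sketch (variable names are illustrative, not from the cited papers):

```python
import numpy as np

rng = np.random.default_rng(1)

def ppca_ml(S, q):
    """Closed-form ML solution for probabilistic PCA.
    S: (p, p) sample covariance; q: latent dimension.
    Returns loadings W (p, q) and isotropic noise variance sigma2."""
    eigvals, eigvecs = np.linalg.eigh(S)                 # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending order
    sigma2 = eigvals[q:].mean()                          # average discarded eigenvalue
    W = eigvecs[:, :q] * np.sqrt(np.maximum(eigvals[:q] - sigma2, 0.0))
    return W, sigma2

# Sanity check on synthetic data with a 1-dimensional latent subspace.
p, q, n = 5, 1, 5000
true_w = rng.normal(size=(p, q))
z = rng.normal(size=(n, q))
X = z @ true_w.T + 0.1 * rng.normal(size=(n, p))  # noise std 0.1, variance 0.01
S = np.cov(X, rowvar=False)
W, sigma2 = ppca_ml(S, q)
```

The fitted model covariance W Wᵀ + σ² I closely reproduces the sample covariance, which is why, in the mixture-of-PCAs model, each component's update is available explicitly rather than requiring an inner iteration.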
“…We believe that a careful study of the limitations noticed so far would lead to better sampling schemes that would then be fully applicable to truly high-dimensional Machine Learning tasks. Finally, we acknowledge the very recent appearance of Utsugi and Kumagai (2001), who set out basic MCMC principles for MFAs along the lines also reported by Fokoué (2000).…”
Section: Discussion
confidence: 79%
“…1, namely, a mixture of factor analyzers, or local FA (including local PCA or local subspaces [1, 32–36, 48, 57–65]). The implementation can be handled in either of the following two choices:…”
Section: (92)
confidence: 99%
“…In other words, the role Ψ_x applies to a standard local FA or a mixture of FA models [63–67].…”
Section: (92)
confidence: 99%