2014
DOI: 10.1007/s11634-014-0182-6
Mixture model averaging for clustering

Abstract: In mixture model-based clustering applications, it is common to fit several models from a family and report clustering results from only the 'best' one. In such circumstances, selection of this best model is achieved using a model selection criterion, most often the Bayesian information criterion. Rather than throw away all but the best model, we average multiple models that are in some sense close to the best one, thereby producing a weighted average of clustering results. Two (weighted) averaging approaches …
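The averaging described in the abstract weights models by how close they are to the best one under the BIC. A minimal sketch of such BIC-based weights, assuming the common smaller-is-better convention BIC = -2·loglik + k·log(n) and weights proportional to exp(-ΔBIC/2), as in standard Bayesian-model-averaging approximations (the function name `bic_weights` is illustrative, not from the paper):

```python
import math

def bic_weights(bics):
    """Convert BIC values (smaller is better) into model-averaging
    weights proportional to exp(-delta/2), where delta is each
    model's BIC distance from the best (smallest) BIC."""
    best = min(bics)
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Example: three fitted models with BICs 100, 102, 110.
# The best model dominates but the runner-up still contributes.
weights = bic_weights([100.0, 102.0, 110.0])
```

Cluster memberships from each model would then be combined using these weights; which quantity to average (e.g., component membership probabilities after aligning labels) is the substantive question the paper addresses.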

Cited by 18 publications (15 citation statements)
References 51 publications
“…Steele and Raftery 2010). The model averaging approach used by Wei and McNicholas (2015) provides an alternative to the 'single best model' paradigm; however, it too depends on the BIC. Furthermore, as the field moves away from Gaussian mixture models, questions around the efficacy of the BIC for mixture model selection will only grow in frequency and intensity, although there is theoretical justification for using the BIC to compare non-nested models (cf.…”
Section: Discussion
confidence: 99%
“…In a parametric clustering framework the idea of combining different models has been developed in order to obtain partitions based on an average of models rather than on a single one. Both the works of Russell et al (2015) and Wei and McNicholas (2015) propose a Bayesian model averaging approach to postprocess the results of model-based clustering. A key issue pointed out in both the proposals consists in the need of selecting an invariant quantity, i.e.…”
Section: Framework and Model Specification
confidence: 99%
“…Simulation results supporting the use of the BIC for selecting the number of factors in a factor analysis model are given by Lopes and West (2004). More recent accounts of the advantages and drawbacks of the BIC and some alternatives are given by Maugis et al. (2009), Hennig (2010), Wei and McNicholas (2014), and Bhattacharya and McNicholas (2014). A simulation study on the sensitivity of the BIC to the number of components is presented in Section 5.…”
Section: Model Selection and Convergence
confidence: 99%
“…, f_G(x | θ_G) are usually taken to be of the same type and, until quite recently, the Gaussian mixture model has dominated the model-based clustering and classification literature (e.g., McLachlan and Peel, 2000; McLachlan et al., 2003; Bouveyron et al., 2007; McNicholas and Murphy, 2008, 2010; Baek et al., 2010; Montanari and Viroli, 2011; Bhattacharya and McNicholas, 2014; Browne and McNicholas, 2014; Wei and McNicholas, 2014). The density of a Gaussian mixture model is f(x | ϑ) = ∑_{g=1}^{G} π_g φ(x | µ_g, Σ_g), where φ(x | µ_g, Σ_g) is the multivariate Gaussian density with mean µ_g and covariance matrix Σ_g.…”
Section: Introduction
confidence: 99%
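The mixture density quoted above, f(x | ϑ) = ∑_{g=1}^{G} π_g φ(x | µ_g, Σ_g), can be evaluated directly. A minimal NumPy sketch, with the Gaussian density written out explicitly (the helper names `mvn_pdf` and `gmm_density` are illustrative, not from any cited paper):

```python
import numpy as np

def mvn_pdf(x, mu, sigma):
    """Multivariate Gaussian density phi(x | mu, Sigma)."""
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2.0 * np.pi) ** d * np.linalg.det(sigma))
    return np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff) / norm

def gmm_density(x, pis, mus, sigmas):
    """Evaluate f(x | theta) = sum_g pi_g * phi(x | mu_g, Sigma_g)."""
    return sum(
        pi * mvn_pdf(x, mu, sigma)
        for pi, mu, sigma in zip(pis, mus, sigmas)
    )

# Example: equal-weight two-component mixture on the real line,
# components centred at -1 and +1 with unit variance.
density_at_origin = gmm_density(
    np.array([0.0]),
    [0.5, 0.5],
    [np.array([-1.0]), np.array([1.0])],
    [np.array([[1.0]]), np.array([[1.0]])],
)
```

For numerically demanding settings (high dimension, near-singular Σ_g), working with log-densities and a Cholesky factorization would be the more robust design choice.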