2014
DOI: 10.1016/j.csda.2013.07.008

Parsimonious skew mixture models for model-based clustering and classification

Abstract: In recent work, robust mixture modelling approaches using skewed distributions have been explored to accommodate asymmetric data. We introduce parsimony by developing skew-t and skew-normal analogues of the popular GPCM family that employ an eigenvalue decomposition of a positive-semidefinite matrix. The methods developed in this paper are compared to existing models in both an unsupervised and a semi-supervised classification framework. Parameter estimation is carried out using the expectation-maximization algorithm…
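For context, the parsimonious structure referred to in the abstract follows the usual GPCM-style eigen-decomposition of each component scale matrix. The sketch below uses the conventional notation for that decomposition; the exact symbols are an assumption, not a quotation from the paper.

    % Standard GPCM-style decomposition of the g-th component scale matrix
    % (conventional notation; the paper's exact symbols may differ).
    \[
      \Sigma_g = \lambda_g \, D_g \, A_g \, D_g^{\top},
    \]
    % where \lambda_g = |\Sigma_g|^{1/p} controls the volume, D_g is the
    % orthogonal matrix of eigenvectors (orientation), and A_g is a diagonal
    % matrix with |A_g| = 1 (shape). Constraining each of these factors to be
    % equal across components, or to the identity, generates the parsimonious
    % family of covariance structures.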

Cited by 87 publications (37 citation statements)
References: 56 publications
“…The Matlab software mixmod (Biernacki et al., 2006) and the R package mixture (Browne and McNicholas, 2014) implement all fourteen models. Recent developments of EDGMM include the incorporation of t distributions to deal with outliers (t-EDGMM; Andrews and McNicholas, 2012), the incorporation of skew distributions to further tackle asymmetry (skew-EDGMM; Vrbik and McNicholas, 2014), and the extension of t-EDGMM in the presence of missing data (Lin, 2014). A central issue in learning an EDGMM is determining a suitable number of mixture components and an appropriate covariance structure.…”
Section: Introduction (mentioning)
Confidence: 99%
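The model-selection issue raised in this excerpt, choosing both the number of components and a covariance structure, is commonly handled by minimising BIC over a grid of candidate models. The sketch below is only an illustration, not the cited software: scikit-learn's GaussianMixture exposes four covariance structures, a much smaller family than the fourteen GPCM/EDGMM models implemented in mixmod and the R package mixture.

    # Illustrative sketch: joint selection of the number of mixture components
    # and a covariance structure by minimising BIC.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Toy data: two elongated Gaussian clusters in 2-D.
    X = np.vstack([
        rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=200),
        rng.multivariate_normal([4, 4], [[1.0, -0.6], [-0.6, 1.0]], size=200),
    ])

    best = None
    for covariance_type in ["full", "tied", "diag", "spherical"]:
        for n_components in range(1, 6):
            gm = GaussianMixture(n_components=n_components,
                                 covariance_type=covariance_type,
                                 n_init=5, random_state=0).fit(X)
            bic = gm.bic(X)  # lower BIC is better
            if best is None or bic < best[0]:
                best = (bic, n_components, covariance_type, gm)

    bic, n_components, covariance_type, model = best
    print(f"Selected G={n_components}, structure='{covariance_type}', BIC={bic:.1f}")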
“…The Matlab software mixmod (Biernacki et al, 2006) and R software mixture (Browne and McNicholas, 2014) implement all of the fourteen models. Recent developments of EDGMM include the incorporation of t distributions to deal with outliers (t-EDGMM; Andrews and McNicholas, 2012), the incorporation of skew distributions to further tackle asymmetry (skew-EDGMM; Vrbik and McNicholas, 2014) and the extension of t-EDGMM in the presence of missing data (Lin, 2014). A central issue in learning EDGMM is to determine a suitable number of mixture components and covariance structure.…”
Section: Introductionmentioning
confidence: 99%
“…For non-continuous data, either longitudinal or multi-dimensional, classification is often performed by mixture modeling [15-18]. In the case of trajectory classification, each individual trajectory is modeled by a mixture of a finite number of polynomials or spline functions, with the mixing proportions varying from one individual to another. Some methods assume that there is no intra-group heterogeneity (e.g.…”
Section: Introduction (mentioning)
Confidence: 99%
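As a rough illustration of clustering trajectories with a mixture model, the sketch below uses a crude two-stage stand-in rather than the single joint mixture likelihood the excerpt describes: fit a low-order polynomial to each individual's trajectory, then cluster the fitted coefficients with a Gaussian mixture. The simulated data and class structure are invented for the example.

    # Two-stage stand-in for latent-class trajectory modelling (illustration only).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 12)                  # common measurement times
    n_per_group = 50
    # Simulate two latent trajectory classes: flat vs. rising quadratic.
    flat = 1.0 + 0.1 * rng.standard_normal((n_per_group, t.size))
    rising = (0.5 + 2.0 * t + 1.5 * t**2
              + 0.1 * rng.standard_normal((n_per_group, t.size)))
    Y = np.vstack([flat, rising])              # one row per individual

    # Stage 1: summarise each trajectory by its quadratic polynomial coefficients.
    coefs = np.array([np.polyfit(t, y, deg=2) for y in Y])

    # Stage 2: cluster individuals in coefficient space.
    gm = GaussianMixture(n_components=2, n_init=5, random_state=0).fit(coefs)
    labels = gm.predict(coefs)
    print(np.bincount(labels))                 # roughly 50 / 50 expected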
“…However, this approach still implies that the data are elliptically contoured within each group (Banfield and Raftery, 1993). To address this issue, mixtures of skew-normal or skew-t distributions can be used (Lin et al., 2007a,b; Cabral et al., 2012; Prates et al., 2013a; Vrbik and McNicholas, 2014). However, these distributions can prove numerically unstable in high-dimensional settings (Frühwirth-Schnatter and Pyne, 2009).…”
Section: Introduction (mentioning)
Confidence: 99%
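As a deliberately simplified illustration of the skew-normal mixtures mentioned here, the sketch below fits a two-component univariate skew-normal mixture by direct numerical maximum likelihood. The cited papers work with multivariate formulations and EM-type algorithms, so this only shows the basic ingredients; the simulated data and starting values are arbitrary.

    # Minimal 1-D two-component skew-normal mixture fitted by direct numerical
    # maximum likelihood (illustration only; not an EM implementation).
    import numpy as np
    from scipy import optimize, stats

    x = np.concatenate([
        stats.skewnorm.rvs(a=4.0, loc=0.0, scale=1.0, size=300, random_state=2),
        stats.skewnorm.rvs(a=-3.0, loc=5.0, scale=1.5, size=200, random_state=3),
    ])

    def nll(theta):
        # theta = (logit weight, shape1, loc1, log-scale1, shape2, loc2, log-scale2)
        w = 1.0 / (1.0 + np.exp(-theta[0]))
        a1, m1, s1 = theta[1], theta[2], np.exp(theta[3])
        a2, m2, s2 = theta[4], theta[5], np.exp(theta[6])
        dens = (w * stats.skewnorm.pdf(x, a1, loc=m1, scale=s1)
                + (1.0 - w) * stats.skewnorm.pdf(x, a2, loc=m2, scale=s2))
        return -np.sum(np.log(dens + 1e-300))

    start = np.array([0.0, 1.0, np.quantile(x, 0.25), 0.0,
                      -1.0, np.quantile(x, 0.75), 0.0])
    fit = optimize.minimize(nll, start, method="Nelder-Mead",
                            options={"maxiter": 20000, "maxfev": 20000})
    print("converged:", fit.success, "negative log-likelihood:", round(fit.fun, 2))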