2018
DOI: 10.1080/00949655.2018.1554659

Robust inference for parsimonious model-based clustering

Abstract: We introduce a robust clustering procedure for parsimonious model-based clustering. The classical mclust framework is robustified through impartial trimming and eigenvalue-ratio constraints (the tclust framework, which is robust but not affine invariant). An advantage of our resulting mtclust approach is that eigenvalue-ratio constraints are not needed for certain model formulations, leading to affine invariant robust parsimonious clustering. We illustrate the approach via simulations and a benchmark real data…
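
For context on the methods named in the abstract: in tclust-type procedures, impartial trimming discards a fixed fraction of the worst-fitting observations, and an eigenvalue-ratio constraint bounds the relative sizes of the component scatter matrices. The display below is only a sketch of that standard formulation, not a formula quoted from the paper; the symbols (trimming level \alpha, constraint constant c, scatter matrices \Sigma_g) are generic notation assumed here.

% Sketch of the standard trimming-plus-eigenvalue-ratio formulation (assumed notation)
\max_{\theta,\; S \subset \{1,\dots,n\},\; |S| = \lceil n(1-\alpha) \rceil} \;
  \sum_{i \in S} \log \sum_{g=1}^{G} \pi_g \, \phi(x_i \mid \mu_g, \Sigma_g)
\quad \text{subject to} \quad
\frac{\max_{g,l} \lambda_l(\Sigma_g)}{\min_{g,l} \lambda_l(\Sigma_g)} \le c, \qquad c \ge 1.

The abstract's point is that, for some parsimonious covariance formulations, the eigenvalue-ratio bound becomes unnecessary, which is what restores affine invariance.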

Cited by 9 publications (4 citation statements)
References 41 publications

“…However, the use of the PLR decomposition of an orthogonal matrix is not limited to CPCA, and other statistical models may benefit from its use. Indeed, the PLR decomposition may be used to simplify the ML estimation of the orthogonal matrix related, only to mention a few, to: CPCA based on further non-normal distributions for the groups, other multiple group models allowing for common covariance structures (Flury 1986a; Greselin and Punzo 2013), parsimonious model-based clustering, classification and discriminant analysis (Banfield and Raftery 1993; Flury et al 1994; Celeux and Govaert 1995; Fraley and Raftery 2002; Andrews and McNicholas 2012; Bagnato et al 2014; Lin 2014; Vrbik and McNicholas 2014; Dang et al 2015; Punzo et al 2018; Dotto and Farcomeni 2019), and sophisticated multivariate distributions (Forbes and Wraith 2014; Punzo and Tortora 2019). We pursue to handle these possibilities in future works.…”
Section: Discussion (mentioning)
confidence: 99%
“…However, the use of the PLR decomposition of an orthogonal matrix is not limited to CPCA, and other statistical models may benefit from its use. Indeed, the PLR decomposition may be used to simplify the ML estimation of the orthogonal matrix related, only to mention a few, to: CPCA based on further non-normal distributions for the groups, other multiple group models allowing for common covariance structures (Flury, 1986a, and Greselin and Punzo, 2013), parsimonious model-based clustering, classification and discriminant analysis (Banfield and Raftery, 1993, Flury et al, 1994, Celeux and Govaert, 1995, Fraley and Raftery, 2002, Andrews and McNicholas, 2012, Bagnato et al, 2014, Lin, 2014, Vrbik and McNicholas, 2014, Dang et al, 2015, Dotto and Farcomeni, 2019), extensions of hidden Markov models (Maruotti, 2016), and sophisticated multivariate distributions (Forbes and Wraith, 2014, and Punzo and Tortora, 2018). We pursue to handle these possibilities in future works.…”
Section: Discussion (mentioning)
confidence: 99%
“…Notice that the constraint in (8) is still needed whenever either the shape or the volume is free to vary across components (García-Escudero et al, 2017), that is, for all models in Table 1 that present a "Required" entry in the ER column. The considered approach is the (semi)supervised version of the methodology proposed in Dotto and Farcomeni (2019), which is framed in a completely unsupervised scenario. Feasible and computationally efficient algorithms for enforcing the eigen-ratio constraint for different patterned models are reported in Appendix C.…”
Section: Model Formulation (mentioning)
confidence: 99%