2018
DOI: 10.1080/10618600.2017.1377081

Sparse Convex Clustering

Abstract: Convex clustering, a convex relaxation of k-means clustering and hierarchical clustering, has drawn recent attention since it nicely addresses the instability issue of traditional nonconvex clustering methods. Although its computational and statistical properties have been studied recently, the performance of convex clustering has not yet been investigated in the high-dimensional clustering scenario, where the data contain a large number of features and many of them carry no information about the clustering …

Cited by 58 publications (96 citation statements) · References 27 publications
“…Restricting our attention to standard convex clustering ($p(\cdot) = \lVert\cdot\rVert_q$), several useful methodological extensions have been proposed in the literature. For example, Wang et al. (2016) augment the convex clustering problem (1) with an additional sparse component to add robustness to outliers, similar to the robust PCA formulation of Candès et al. (2011), while Wang et al. (2018) propose a variant which incorporates feature selection into the clustering objective using an $\ell_1$ penalty (Tibshirani, 1996). As discussed in Section C, Chi et al. […] The squared Frobenius loss function of the convex clustering problem may be interpreted as an isotropic Gaussian likelihood, suggesting another avenue for generalization.…”
Section: F. Additional Related Work
confidence: 99%
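For context, here is a minimal sketch of the standard convex clustering problem this quote refers to, in common notation; the weights $w_{ij}$ and tuning parameter $\gamma$ are illustrative assumptions, not taken verbatim from the cited papers:

\[
\min_{A \in \mathbb{R}^{n \times p}} \; \tfrac{1}{2}\sum_{i=1}^{n} \lVert x_i - a_i \rVert_2^2 \;+\; \gamma \sum_{i < j} w_{ij}\, \lVert a_i - a_j \rVert_q ,
\]

where each observation $x_i$ is assigned its own centroid $a_i$ (a row of $A$) and the fusion penalty $p(\cdot) = \lVert\cdot\rVert_q$ shrinks centroids toward one another. The first term equals $\tfrac{1}{2}\lVert X - A \rVert_F^2$, and minimizing it matches maximizing an isotropic Gaussian likelihood with means $a_i$, which is the interpretation the quote mentions.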
“…As a special case of the triangle lasso, convex clustering has drawn much attention [2], [3], [5], [6], [7], [8], [9]. [5] proposes a new stochastic incremental algorithm to conduct convex clustering.…”
Section: Convex Clustering
confidence: 99%
“…[2] uses an $\ell_{2,1}$ regularization to screen out noisy features when conducting convex clustering. [3] investigates removing sparse outliers or uninformative features when conducting convex clustering. However, both of them use more than one convex regularization term in the formulation, which requires tuning multiple hyper-parameters in practical scenarios.…”
Section: Convex Clustering
confidence: 99%
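To make the penalties in this quote concrete, the following is a sketch of a sparse convex clustering objective of the kind described, in the spirit of Wang et al. (2018); the symbols $\gamma_1$, $\gamma_2$, $w_{ij}$, and $u_k$ are illustrative assumptions rather than the exact notation of [2] or [3]:

\[
\min_{A \in \mathbb{R}^{n \times p}} \; \tfrac{1}{2}\sum_{i=1}^{n}\lVert x_i - a_i\rVert_2^2 \;+\; \gamma_1 \sum_{i<j} w_{ij}\,\lVert a_i - a_j\rVert_q \;+\; \gamma_2 \sum_{k=1}^{p} u_k\,\lVert A_{\cdot k}\rVert_2 ,
\]

where $A_{\cdot k}$ is the $k$-th feature column of the centroid matrix. The group-lasso-type third term can zero out an entire feature column, removing uninformative features, and the presence of two tuning parameters $\gamma_1, \gamma_2$ is precisely the multiple-hyper-parameter issue the quote raises.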
“…A nice method using ideas similar to the LASSO is ClusterPath [17]. This very interesting and efficient method has been studied and extended in [28], [26] and [32]. One of the main drawbacks of this approach is the lack of a robust rule for choosing the parameters governing the procedure, although they seem to be reasonably easy to tune in practice.…”
Section: Recent Advances in Clustering
confidence: 99%