2020
DOI: 10.1109/access.2020.2981740
A Nonconvex Implementation of Sparse Subspace Clustering: Algorithm and Convergence Analysis

Abstract: Subspace clustering has been widely applied to detect meaningful clusters in high-dimensional data spaces, and sparse subspace clustering (SSC) obtains superior clustering performance by solving a relaxed ℓ0-minimization problem with the ℓ1-norm. Although using the ℓ1-norm instead of the ℓ0-norm makes the objective function convex, it causes large errors on large coefficients in some cases. In this paper, we study the sparse subspace clustering algorithm based on a nonconvex modeling formulation. Specifically, we …
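For context, the convex ℓ1 relaxation of SSC that the abstract refers to (the standard formulation, not the paper's nonconvex variant) can be written as

    \min_{C} \|C\|_1 \quad \text{s.t.} \quad X = XC, \ \operatorname{diag}(C) = 0,

where the columns of X are the data points, each point is expressed as a combination of the others, and the affinity W = |C| + |C|^T is handed to spectral clustering. The nonconvex variant studied in the paper replaces the ℓ1-norm with a nonconvex sparsity-promoting penalty.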

Cited by 7 publications (8 citation statements)
References 34 publications
“…The Alternating Direction Method of Multipliers (ADMM) [18] is used to solve the sparse subspace clustering algorithm based on the non-convex φ energy functional.…”
Section: Solution of the Non-convex Algorithm
confidence: 99%
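As a rough illustration of the ADMM strategy mentioned in this citation, the sketch below (Python/NumPy, written against the convex ℓ1 LASSO-style form of SSC; the function and parameter names are illustrative, not the authors' code) alternates a least-squares update, an elementwise proximal step, and a dual update. Swapping the soft-thresholding step for the proximal operator of a nonconvex penalty would give a solver of the kind described for the nonconvex φ functional.

    import numpy as np

    def soft_threshold(V, tau):
        # proximal operator of the l1-norm: elementwise shrinkage toward zero
        return np.sign(V) * np.maximum(np.abs(V) - tau, 0.0)

    def ssc_admm(X, lam=10.0, rho=1.0, n_iter=200):
        # Sketch of ADMM for  min_C 0.5*lam*||X - XC||_F^2 + ||C||_1,  diag(C) = 0,
        # with the splitting A = C. Replacing soft_threshold with a nonconvex
        # proximal map (e.g. for an l_q penalty) yields a nonconvex variant.
        n = X.shape[1]
        G = X.T @ X
        inv = np.linalg.inv(lam * G + rho * np.eye(n))   # factor once, reuse each iteration
        A = np.zeros((n, n))
        U = np.zeros((n, n))
        for _ in range(n_iter):
            C = inv @ (lam * G + rho * (A - U))          # least-squares (smooth) step
            A = soft_threshold(C + U, 1.0 / rho)         # sparsity-inducing proximal step
            np.fill_diagonal(A, 0.0)                     # forbid self-representation
            U = U + C - A                                # dual ascent on the constraint A = C
        W = np.abs(A) + np.abs(A).T                      # symmetric affinity for spectral clustering
        return A, W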
“…Least Squares Regression (LSR) [16] minimizes the representation coefficients under the F-norm (Frobenius norm) to obtain the representation coefficient matrix, which can group highly correlated data together. Re-weighted Sparse Subspace Clustering (RSSC) [17] adds weights to the representation coefficients of the ℓ1-norm constraint to reconstruct the representation coefficient matrix, which is closer to the ℓ0-norm than the traditional ℓ1-norm. The non-convex formulation of sparse subspace clustering [18] performs clustering by solving for the representation coefficients, using the ℓq-norm in place of the ℓ0-norm constraint in the objective function. Although the ℓ1-norm is used as the constraint penalty term to make the objective function convex, it cannot accurately describe the sparsity of the coefficient matrix.…”
Section: Introduction
confidence: 99%
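A two-line numerical check makes the bias discussed above concrete (a toy sketch, not taken from the paper): soft-thresholding, the proximal operator of the ℓ1-norm, shrinks large coefficients, whereas an ℓ0-style hard threshold leaves them untouched.

    import numpy as np

    def soft_threshold(v, tau):
        # proximal operator of the l1-norm: shrinks every entry toward zero
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def hard_threshold(v, tau):
        # l0-style thresholding: keeps large entries exactly, zeroes the rest
        return np.where(np.abs(v) > tau, v, 0.0)

    v = np.array([5.0, 0.3])
    print(soft_threshold(v, 1.0))  # [4. 0.]  -> the large coefficient is biased by 1.0
    print(hard_threshold(v, 1.0))  # [5. 0.]  -> the large coefficient is preserved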
“…Hence, we might be interested in finding a good nonconvex surrogate for such general nonconvex problems. Prior works support the superiority of nonconvex sparsity-promoting surrogates [9], [12], [26]–[31].…”
Section: Analysis on the ℓ0-Norm Approximation: A Desirable Criterion
confidence: 99%
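As one example of such a surrogate (an illustration, not necessarily the choice made in the cited works), the ℓq quasi-norm with 0 < q < 1,

    \|c\|_q^q = \sum_i |c_i|^q \ \longrightarrow\ \|c\|_0 \quad \text{as } q \to 0,

penalizes large coefficients less heavily than the ℓ1-norm while approaching the ℓ0 count as q decreases.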
“…The most representative ones are sparse subspace clustering [10] (SSC, pursuing a sparse coefficient matrix) and low-rank subspace clustering [11] (LRR, pursuing a low-rank coefficient matrix). Thereafter, many extensions have been developed to improve these methods, such as the sparse subspace clustering algorithm based on a nonconvex modeling formulation [12] and sketch-based subspace clustering [13].…”
Section: Introduction
confidence: 99%
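For comparison with the sparse model sketched earlier, the LRR model mentioned in this citation is commonly written as

    \min_{C,E} \|C\|_* + \lambda \|E\|_{2,1} \quad \text{s.t.} \quad X = XC + E,

where \|\cdot\|_* is the nuclear norm that promotes a low-rank coefficient matrix and E absorbs sample-specific corruptions (a standard textbook form, not a detail taken from the citing paper).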