2019
DOI: 10.1109/tkde.2018.2858782

Low-Rank Sparse Subspace for Spectral Clustering

Cited by 171 publications (52 citation statements)
References 38 publications
“…After optimizing (6), elaborated in Section 2.3, we conduct feature selection [24, 42–44, 53] by discarding the regressors (or the response variables) whose corresponding coefficients in B (or A) are zeros in the rows. More specifically, according to (12), the sparse rows of A imply that their corresponding columns (i.e.…”
Section: Methods
confidence: 99%
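The row-sparsity criterion quoted above translates directly into a small array operation: drop every feature whose coefficient row is (near-)zero. The sketch below assumes a learned coefficient matrix A of shape (n_features, n_targets); the function name, the tolerance tol, and the toy data are illustrative, not taken from the paper.

```python
import numpy as np

def select_features(X, A, tol=1e-8):
    """Keep the columns of X whose coefficient rows in A are nonzero.

    An all-(near-)zero row of A means the corresponding feature does not
    contribute to any response variable, so it is discarded.
    """
    row_norms = np.linalg.norm(A, axis=1)   # l2 norm of each row of A
    keep = row_norms > tol                  # mask of informative features
    return X[:, keep], keep

# Toy usage: 5 features; rows 1 and 3 of A are exactly zero.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 5))
A = rng.standard_normal((5, 3))
A[[1, 3], :] = 0.0
X_sel, mask = select_features(X, A)
print(mask)          # [ True False  True False  True]
print(X_sel.shape)   # (10, 3)
```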
“…This section describes the optimization process of the parameters b, B, and A. Specifically, we iteratively conduct the following three steps until convergence by means of Iteratively Reweighted Least Squares (IRLS) [32, 53]: (i) Update b with fixed B and A. (ii) Update B with fixed b and A.…”
Section: Methods
confidence: 99%
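The excerpt above alternates IRLS updates over three coupled blocks (b, B, A). The paper's exact objective is not given here, so the sketch below instead shows the core IRLS pattern on the standard l2,1-regularized regression min_B ||Y − XB||_F^2 + λ||B||_{2,1}, a textbook instance of the cited technique; names and defaults are assumptions.

```python
import numpy as np

def irls_l21(X, Y, lam=0.1, n_iter=50, eps=1e-8):
    """IRLS for  min_B ||Y - X B||_F^2 + lam * ||B||_{2,1}.

    Each iteration solves a reweighted ridge system; the weight of row i
    is 1 / (2 * ||B_i||_2), so rows that shrink toward zero receive an
    ever-larger penalty and end up (near-)zero, i.e. row-sparse.
    """
    B = np.linalg.lstsq(X, Y, rcond=None)[0]             # warm start
    for _ in range(n_iter):
        row_norms = np.maximum(np.linalg.norm(B, axis=1), eps)
        D = np.diag(1.0 / (2.0 * row_norms))             # reweighting matrix
        B = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)  # weighted ridge step
    return B
```

In the three-block setting quoted above, the same reweighted step would simply be interleaved with the b and A updates inside the outer loop.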
“…That is, SSC enforces C to be sparse, LSR tends to group highly correlated data together, while LRR encourages C to be low rank [47]. Actually, most RBSC algorithms so far are static SC algorithms and are variants or extensions of SSC, LSR, LRR, or a combination of them (e.g., LSS [48]). The optimal C* obtained by (3) will then be utilized to build an affinity matrix, denoted W. The SC results will be obtained after applying spectral clustering algorithms [49]–[52] to W.…”
Section: B. Subspace Clustering
confidence: 99%
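The pipeline in this excerpt (solve for a representation C, symmetrize it into an affinity W, then run spectral clustering) can be sketched end to end with the LSR representation, which has a closed-form solution. This is a minimal sketch of the generic self-expressive pipeline, not the paper's LSS model; λ and the data layout are assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def lsr_spectral_clustering(X, n_clusters, lam=0.1):
    """Self-expressive clustering in the LSR style.

    Solves  min_C ||X - XC||_F^2 + lam * ||C||_F^2, whose closed form is
    C = (X^T X + lam I)^{-1} X^T X, then clusters the symmetrized
    affinity |C| + |C|^T spectrally.
    X: (n_features, n_samples) -- columns are data points.
    """
    n = X.shape[1]
    G = X.T @ X
    C = np.linalg.solve(G + lam * np.eye(n), G)        # representation matrix
    W = (np.abs(C) + np.abs(C.T)) / 2.0                # affinity matrix
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed").fit_predict(W)
    return labels
```

SSC and LRR fit the same template but swap the regularizer on C (an l1 norm and a nuclear norm, respectively), which replaces the closed form with an iterative solver.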
“…Meanwhile, some research proposes to learn and update these graph similarity matrices iteratively to mitigate the problems of high-dimensional and noisy data. Reference [19] proposes a Low-rank Sparse Subspace (LSS) clustering method that dynamically learns the affinity matrix from a low-dimensional space of the original data. The authors in Ref.…”
Section: Related Work
confidence: 99%
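The dynamic affinity learning attributed to LSS in this excerpt alternates between a low-dimensional embedding and an affinity built in that space. The loop below is only a schematic of that idea under assumed choices (Gaussian affinity, spectral re-embedding); the paper's actual update rules differ.

```python
import numpy as np
from scipy.linalg import eigh

def dynamic_affinity(X, dim, n_iter=10, sigma=1.0):
    """Schematic of iteratively learning an affinity matrix from a
    low-dimensional embedding. X: (n_samples, n_features)."""
    n = X.shape[0]
    Z = X[:, :min(dim, X.shape[1])]                  # crude initial embedding
    for _ in range(n_iter):
        # (1) Gaussian affinity in the current low-dimensional space.
        d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        # (2) Re-embed: bottom eigenvectors of the normalized Laplacian.
        Dm = np.diag(1.0 / np.sqrt(W.sum(axis=1) + 1e-12))
        L = np.eye(n) - Dm @ W @ Dm
        _, vecs = eigh(L)
        Z = vecs[:, :dim]                            # refreshed embedding
    return W
```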