2022
DOI: 10.1109/tcsvt.2022.3162575
Efficient and Robust MultiView Clustering With Anchor Graph Regularization

Cited by 39 publications (9 citation statements)
References 56 publications
“…Specifically, sparse representation clustering (SSC) [11] is applied to each view independently to generate a sparse self-representation matrix, and the best single-view result is reported; low-rank representation clustering (LRR) [12] is applied to each view independently to generate a low-rank self-representation matrix, and the best single-view result is reported; robust multi-view spectral clustering (RMSC) [48] recovers a shared low-rank transition probability matrix via a Markov chain method; diversity-induced clustering (DiMSC) [49] exploits the complementarity of the views through a diversity term; graph learning clustering (GMC) [55] fuses the data graph matrices of all views into a unified graph matrix; latent subspace clustering (LMSC) [36] learns a latent low-rank representation by mapping the original space into a low-dimensional latent space.

Accompanying results (mean±std over six metrics; the dataset name and metric headers did not survive extraction):

GMC [55]       0.381±0.000  0.519±0.000  0.191±0.000  0.281±0.000  0.174±0.000  0.732±0.000
LMSC [36]      0.563±0.000  0.525±0.000  0.397±0.000  0.440±0.000  0.430±0.000  0.450±0.000
MLAN [56]      0.331±0.000  0.475±0.000  0.151±0.000  0.248±0.000  0.150±0.000  0.731±0.000
tSVDMSC [16]   0.812±0.007  0.858±0.007  0.771±0.003  0.788±0.001  0.743±0.006  0.839±0.003
ETLMSC [28]    0.878±0.000  0.902±0.000  0.851±0.000  0.862±0.000  0.848±0.000  0.877±0.000
LRTG [57]      0.615±0.016  0.657±0.005  0.486±0.016  0.525±0.014  0.485±0.023  0.572±0.005
GNLTA [18]     0.881±0.043  0.895±0.013  0.850±0.032  0.861±0.030  0.846±0.041  0.876±0.020
ERMC-AGR [45]  0.430±0.023  0.430±0.012  0.259±0.007  0.315±0.007  0.289±0.005  0.347±0.013
RLMSC [51]     0.551±0.000  0.730±0.000  0.477±0.000  0.513±0.000  0.516±0.000  0.511±0.000…”
Section: B Comparison With State-of-the-art Methods
Citation type: mentioning
confidence: 99%
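The per-view self-representation step described for SSC above can be sketched concretely. This is a minimal illustration, not the cited authors' implementation: it solves one Lasso problem per sample so that each column of the data is reconstructed sparsely from the other samples, then symmetrizes the coefficients into an affinity matrix for spectral clustering. Function and variable names here are our own.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_self_representation(X, alpha=0.01):
    """Sparse self-representation in the SSC style: each sample (a column
    of X, shape d x n) is written as a sparse combination of the other
    samples, with a zero diagonal so no sample represents itself."""
    d, n = X.shape
    C = np.zeros((n, n))
    for i in range(n):
        mask = np.arange(n) != i                  # exclude sample i itself
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[:, mask], X[:, i])            # min ||x_i - X_-i c||^2 + alpha*||c||_1
        C[mask, i] = lasso.coef_
    return C

def affinity(C):
    """Symmetric affinity |C| + |C|^T, the usual input to spectral clustering."""
    return np.abs(C) + np.abs(C).T
```

Running this on each view separately and reporting the best single view matches the baseline protocol the citing paper describes for SSC and LRR.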
“…Scene-15 results (mean±std over six metrics; metric headers did not survive extraction):

WETMSC         0.904±0.011  0.929±0.007  0.891±0.009  0.899±0.008  0.887±0.011  0.911±0.006
SSC [11]       0.420±0.015  0.723±0.005  0.303±0.011  0.317±0.012  0.441±0.025  0.248±0.010
LRR [12]       0.510±0.009  0.728±0.014  0.304±0.017  0.339±0.008  0.627±0.012  0.231±0.010
RMSC [48]      0.346±0.036  0.573±0.047  0.246±0.031  0.258±0.027  0.457±0.033  0.182±0.031
DiMSC [49]     0.351±0.000  0.589±0.000  0.226±0.000  0.253±0.000  0.362±0.000  0.191±0.000
LT-MSC [15]    0.559±0.012  0.788±0.005  0.393±0.007  0.403±0.003  0.670±0.009  0.288±0.012
GMC [55]       0.331±0.000  0.544±0.000  0.031±0.000  0.081±0.000  0.044±0.000  0.470±0.000
LMSC [36]      0.566±0.012  0.818±0.004  0.383±0.010  0.392±0.010  0.710±0.014  0.271±0.008
MLAN [56]      0.579±0.024  0.748±0.020  0.222±0.015  0.265±0.015  0.173±0.009  0.560±0.016
tSVDMSC [16]   0.607±0.005  0.858±0.003  0.430±0.005  0.440±0.010  0.742±0.007  0.323±0.009
ETLMSC [28]    0.639±0.019  0.899±0.007  0.456±0.017  0.465±0.017  0.825±0.029  0.324±0.012
LRTG [57]      0.490±0.000  0.750±0.000  0.340±0.000  0.350±0.000  0.547±0.000  0.260±0.000
GNLTA [18]     0.604±0.016  0.875±0.005  0.444±0.017  0.453±0.016  0.776±0.018  0.320±0.015
ERMC-AGR [45]  0.169±0.000  0.307±0.000  0.153±0.000  0.179±0.000  0.166±0.000  0.194±0.000
RLMSC [51]     0.512±0.000  0.837±0.000  0.419±0.000  0.429±0.000  0.669±0.000  0.316±0.000

Caltech101 results (only the WETMSC row survived extraction):

WETMSC         0.673±0.028  0.902±0.018  0.497±0.033  0.500±0.025  0.817±0.029  0.360±0.029

Adaptive neighbors clustering (MLAN) [56] adaptively updates the graph for MVC; anchor graph regularization (ERMC-AGR) [45] learns an embedded anchor graph within a matrix factorization framework and introduces CIM to encode noise; robust localized multi-view subspace clustering (RLMSC) [51] learns a robust consensus representation by fusing the noiseless structures of the views.…”
Section: B Comparison With State-of-the-art Methods
Citation type: mentioning
confidence: 99%
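The anchor graph that ERMC-AGR-style methods build on can be sketched in the standard form from the anchor-graph literature cited here: connect each sample to its k nearest anchors with Gaussian weights, then recover a low-rank full similarity from the anchor graph. The kernel choice, parameters, and names below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def anchor_graph(X, anchors, k=3, sigma=1.0):
    """Build an n x m anchor graph Z: each sample (row of X) is linked to
    its k nearest anchors with Gaussian weights; rows sum to 1."""
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)   # squared distances, (n, m)
    n = X.shape[0]
    idx = np.argsort(d2, axis=1)[:, :k]                         # k nearest anchors per sample
    w = np.exp(-np.take_along_axis(d2, idx, axis=1) / (2 * sigma ** 2))
    Z = np.zeros_like(d2)
    Z[np.arange(n)[:, None], idx] = w / w.sum(axis=1, keepdims=True)
    return Z

def full_similarity(Z):
    """Low-rank similarity S = Z Lambda^{-1} Z^T, Lambda = diag(Z^T 1).
    S is symmetric and its rows sum to 1."""
    lam = np.maximum(Z.sum(axis=0), 1e-12)                      # guard against unused anchors
    return (Z / lam) @ Z.T
```

Because Z has only n*m entries with m much smaller than n, downstream spectral steps can work on Z directly instead of the full n x n similarity.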
“…Since anchors can substantially reduce the time and space complexity of graph construction, they have been widely used in graph-based large-scale data processing since Liu et al. (2010) proposed them (Liu et al. 2010; Guo and Ye 2019; Deng et al. 2016; Yang et al. 2020; Chen and Cai 2011; Cai and Chen 2014; Li et al. 2015, 2016; Hong et al. 2023; Han et al. 2017). Especially after the improvements to anchor-based methods later proposed by Nie et al. (2016b) and Wang et al. (2016), more and more anchor-based large-scale multi-view data clustering algorithms have been proposed (Shen et al. 2022; He et al. 2019, 2020; Shi et al. 2021a; Zhang and Sun 2022; Sun and Zhang 2022; Affeldt et al. 2020; Qiang et al. 2021; Zhang et al. 2020a; Yang et al. 2022a, b; Wang et al. 2019b; Shu et al. 2022). To facilitate understanding, this section first presents the concept of graph construction, then introduces the anchor-based method and its improvements addressing the problems in graph construction, and finally gives a brief description of anchor-based algorithms.…”
Section: Anchors-based Graph Construction Based LSMVC
Citation type: mentioning
confidence: 99%
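The complexity saving mentioned above is easy to make concrete: a full pairwise graph stores O(n^2) entries, while an anchor graph stores O(nm) with m much smaller than n. A back-of-the-envelope check (the sizes are hypothetical, chosen only for illustration):

```python
# Hypothetical sizes: n samples, m anchors, dense float64 storage.
n, m = 100_000, 500

bytes_full = n * n * 8      # full n x n affinity graph: 80 GB
bytes_anchor = n * m * 8    # n x m anchor graph: 0.4 GB
ratio = bytes_full // bytes_anchor

print(ratio)                # -> 200, i.e. n/m times less memory
```

The same n/m factor shows up in the time to build the graph, since each sample is compared against m anchors rather than all n samples.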