2022
DOI: 10.1109/tcyb.2021.3062830
Adversarial Incomplete Multiview Subspace Clustering Networks

Cited by 40 publications (8 citation statements)
References 51 publications
“…For example, Zhao et al [30] developed a novel incomplete multi-view clustering method, which projects all multi-view data to a complete and unified representation under the constraint of the intrinsic geometric structure. Wang et al [32] and Xu et al [33] proposed incomplete multi-view clustering methods based on generative adversarial networks (GANs) [34], targeting two-view and multi-view data respectively, i.e., they use the existing views to generate the missing views. Lin et al [35] introduced the idea of contrastive prediction from self-supervised learning into incomplete multi-view learning to explore the consensus between views by maximizing mutual information and minimizing conditional entropy.…”
Section: Deep Neural Network Based Methods [29][30][31][32]
mentioning
confidence: 99%
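The GAN-based imputation idea summarized in this statement is straightforward: a generator maps an observed view to a plausible estimate of the missing view, while a discriminator tries to distinguish generated views from real ones. Below is a minimal PyTorch sketch of that scheme; it is not the architecture from [32], [33] or from the cited paper, and the network sizes, dim_a/dim_b, and the random tensors standing in for view data are illustrative assumptions.

import torch
import torch.nn as nn

class Generator(nn.Module):
    # Maps an observed view (dim_a) to an estimate of the missing view (dim_b).
    def __init__(self, dim_a, dim_b, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_a, hidden), nn.ReLU(),
            nn.Linear(hidden, dim_b),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    # Scores whether a view-b sample looks real (near 1) or generated (near 0).
    def __init__(self, dim_b, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_b, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

# Toy data: x_a is the observed view of samples whose view b is missing;
# x_b_real is view b of samples where it is observed.
dim_a, dim_b, n = 20, 30, 64
x_a, x_b_real = torch.randn(n, dim_a), torch.randn(n, dim_b)

G, D = Generator(dim_a, dim_b), Discriminator(dim_b)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

# Discriminator step: real view-b samples vs. generated ones (detached).
fake = G(x_a).detach()
loss_d = bce(D(x_b_real), torch.ones(n, 1)) + bce(D(fake), torch.zeros(n, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator step: update G so that imputed views fool the discriminator.
loss_g = bce(D(G(x_a)), torch.ones(n, 1))
opt_g.zero_grad(); loss_g.backward(); opt_g.step()

In a full method, the imputed views would then be fed, together with the observed ones, into the downstream clustering module (e.g., a subspace clustering layer).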
“…With the development of deep learning, a number of recent studies (Zhang et al 2020; Lin et al 2021) employ DNNs to solve incomplete multi-view clustering challenges, showing noticeable improvements in clustering performance. Based on whether they produce the missing data or not, most existing deep incomplete multi-view clustering methods can be roughly classified into two categories, i.e., learning a uniform representation without generating missing data (Wen et al 2020b,a; Wang et al 2021b) and generating missing instances using adversarial learning (Wang et al 2021a, 2018; Zhang et al 2020; Xu et al 2021). The first class of methods reduces the negative influence of missing views by modifying the multi-view fusion representation learning model.…”
Section: Related Work
mentioning
confidence: 99%
“…CLUSTERING RESULTS AND STANDARD DEVIATIONS OF THE DIFFERENT METHODS WITH VARIOUS MISSING RATIOS ON SIX MULTIVIEW DATASETS.
[Flattened results table: mean±std clustering scores for Agg, DAIMC, SRSC, GIMC, EE-IMVC, ANIMC, and ASR at several missing ratios on six multi-view datasets, including ProteinFold and 100leaves; the row and column structure was lost in extraction, so the table is represented here only by this placeholder.]…”
mentioning
confidence: 99%
“…Recently, some IMVC methods have been proposed to alleviate the above problem. From the perspective of clustering techniques, existing IMVC methods are mainly divided into five categories: subspace learning-based methods [41], [43], nonnegative matrix factorization (NMF)-based methods [15], [16], graph learning-based methods [20], [39], multiple kernel-based methods [23], [24], and deep learning-based methods [21], [38], [42], [44], [46]. Wen et al present an extension of the low-rank representation (LRR) model that incorporates feature space-based missing-view inference and manifold space-based similarity graph learning [41].…”
mentioning
confidence: 99%
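For reference, the classical LRR objective that [41] extends can be written as follows; this is the standard generic formulation, not necessarily the exact extended model in [41]:

\min_{Z,E} \; \|Z\|_* + \lambda \|E\|_{2,1} \quad \text{s.t.} \quad X = XZ + E

where X stacks the samples as columns, \|Z\|_* is the nuclear norm encouraging a low-rank self-representation Z, \|E\|_{2,1} absorbs sample-specific corruptions, and \lambda balances the two terms. Per the statement above, the extension in [41] additionally infers the features of missing views in feature space and regularizes the similarity graph with the manifold structure of the data.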