2018
DOI: 10.1609/aaai.v32i1.11621

Reliable Multi-View Clustering

Abstract: With the advent of multi-view data, multi-view learning (MVL) has become an important research direction in machine learning. It is usually expected that multi-view algorithms can obtain better performance than merely using a single view. However, previous research has pointed out that the utilization of multiple views may sometimes even deteriorate the performance. This is a stumbling block for the practical use of MVL in real applications, especially for tasks requiring high dependability. Th…

Cited by 25 publications (3 citation statements); references 27 publications. Citing publications span 2019–2024.
“…Many clustering methods have been proposed to solve the multi-view clustering task; they can be roughly divided into six categories according to their strategies: multi-kernel clustering [37], [42], co-training style clustering [33], multi-view subspace clustering [12], [5], multi-view spectral clustering [21], [62], deep multi-view clustering [50], and graph-based multi-view clustering [31], [29], [41], [54], [20], which is most related to our work.…”
Section: Related Work
confidence: 99%
“…For graph-based multi-view clustering, Nie et al. [29] minimized the reconstruction error by learning an optimal weight for each graph to build a unified graph. Tao et al. [41] learned a fusion graph by using the information of multiple graphs at both the view level and the sample level. Zhan et al. [54] learned cluster-indicating matrices to merge the information from single-view graphs, and then minimized the disagreement between each individual view and the global view to obtain a consensus graph.…”
Section: Related Work
confidence: 99%
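The auto-weighted fusion idea attributed to Nie et al. [29] in the passage above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical implementation of one alternating scheme (weight each view graph by its agreement with the unified graph, then take the weighted average); the function name fuse_graphs and the specific update rules are assumptions for illustration, not the exact objective or constraints of [29].

import numpy as np

def fuse_graphs(view_graphs, n_iter=20, eps=1e-10):
    # view_graphs: list of (n x n) affinity matrices, one per view.
    # Minimal sketch of auto-weighted graph fusion: each view graph gets a
    # weight inversely proportional to its distance from the current unified
    # graph, and the unified graph is the weighted average of the view graphs.
    U = np.mean(view_graphs, axis=0)  # start from the plain average
    w = np.full(len(view_graphs), 1.0 / len(view_graphs))
    for _ in range(n_iter):
        # weight each view by how closely it agrees with the unified graph
        dists = np.array([np.linalg.norm(U - S, 'fro') for S in view_graphs])
        w = 1.0 / (2.0 * dists + eps)
        w = w / w.sum()
        # unified graph = weighted average of the per-view graphs
        U = sum(wi * S for wi, S in zip(w, view_graphs))
    return U, w

A standard spectral clustering step on the fused graph U (for example, scikit-learn's SpectralClustering with affinity='precomputed') would then produce the final partition.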
“…Neighbor Clustering (ANNC); and the internal evaluation Index based on Rawls' Max-Min criterion (Kameda et al., 2016) (MMI). Given that extracting information from each single view is also important (Tao et al., 2018), FDEW aims to learn a set of fusion distance matrices that not only uses the complementary information among multiple views but also exploits the information from each single view. ANNC obeys an intuitive rule that a cluster and its nearest neighbor with higher mass (size) should be grouped into the same cluster during the clustering process (Yang and Lin, 2020).…”
Section: Introduction
confidence: 99%
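The merging rule quoted above (group a cluster with its nearest neighbor of higher mass) can be sketched in a few lines. The helper below, heavier_nearest_neighbor, is a hypothetical illustration of that single rule only; it is not the actual ANNC procedure of Yang and Lin (2020), which wraps this rule in an iterative clustering process.

import numpy as np

def heavier_nearest_neighbor(centers, sizes):
    # centers: (k x d) array of cluster centers; sizes: (k,) array of cluster masses.
    # For each cluster, return the index of its nearest neighboring cluster
    # with strictly higher mass, or -1 if no heavier cluster exists.
    centers = np.asarray(centers, dtype=float)
    sizes = np.asarray(sizes)
    parents = np.full(len(centers), -1)
    for i in range(len(centers)):
        d = np.linalg.norm(centers - centers[i], axis=1)
        heavier = np.where(sizes > sizes[i])[0]
        if heavier.size:
            parents[i] = heavier[np.argmin(d[heavier])]
    return parents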