2023
DOI: 10.1609/aaai.v37i8.26201

Auto-Weighted Multi-View Clustering for Large-Scale Data

Abstract: Multi-view clustering has gained broad attention owing to its capacity to exploit complementary information across multiple data views. Although existing methods demonstrate promising clustering performance, most of them have high time complexity and cannot handle large-scale data. Matrix factorization-based models are a representative approach to this problem. However, they assume that the views share a dimension-fixed consensus coefficient matrix and view-specific base matrices, limiting their representability…
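As a rough sketch of the formulation the abstract refers to (illustrative only, not the paper's actual algorithm), the example below factorizes every view into a view-specific base matrix and one shared, fixed-dimension consensus coefficient matrix; the function name, the plain gradient-descent solver, and all hyperparameters are assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's method): every view X_v (d_v x n) is
# factorized as X_v ≈ W_v H, where W_v is a view-specific base matrix and H is
# ONE shared consensus coefficient matrix with a fixed dimension k.
def shared_consensus_mf(views, k, n_iters=200, lr=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n = views[0].shape[1]
    H = rng.standard_normal((k, n)) * 0.1                     # shared coefficients
    Ws = [rng.standard_normal((X.shape[0], k)) * 0.1 for X in views]
    for _ in range(n_iters):
        for v, X in enumerate(views):
            R = Ws[v] @ H - X                                  # reconstruction residual
            Ws[v] -= lr * R @ H.T                              # gradient step on W_v
            H -= lr * Ws[v].T @ R                              # gradient step on shared H
    return Ws, H

# Toy usage: two views of the same 50 samples with different feature dimensions.
views = [np.random.rand(20, 50), np.random.rand(35, 50)]
Ws, H = shared_consensus_mf(views, k=5)
print(H.shape)  # (5, 50): one fixed-dimension representation for all views
```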

Cited by 20 publications (3 citation statements)
References 27 publications
“…The original multi-view data is factorized to learn a unified discrete indicator matrix containing the clustering results. Wan et al. (2023b) and Wan et al. (2023a) observed that the dimensions of different views may be inconsistent, and that a fixed-dimension consensus matrix learned through matrix factorization hampers the exploration of complementary information. They therefore proposed to first learn an embedding matrix of a different dimension for each view via matrix factorization.…”
Section: Algorithms Based On Matrix Factorization (mentioning, confidence: 99%)
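A minimal sketch, assuming a standard non-negative matrix factorization per view, of the first step described in the statement above: each view keeps its own embedding matrix, and the embedding dimension may differ across views. The dimensions and data are toy values, not taken from Wan et al.

```python
import numpy as np
from sklearn.decomposition import NMF

# Illustrative only: factorize each view separately so that every view gets its
# own embedding matrix, with a view-specific embedding dimension k_v.
views = [np.random.rand(60, 20), np.random.rand(60, 40)]   # samples x features per view
dims = [5, 8]                                               # hypothetical per-view dimensions
embeddings = [
    NMF(n_components=k, init="nndsvda", max_iter=500).fit_transform(X)
    for X, k in zip(views, dims)
]
print([E.shape for E in embeddings])  # [(60, 5), (60, 8)]
```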
“…They proposed to first learn an embedding matrix of a different dimension for each view via matrix factorization. Wan et al. (2023b) then obtained the final consensus matrix through projection matrices and produced the final result by running k-means on that consensus matrix, whereas Wan et al. (2023a) incorporated k-means directly into the optimization objective to obtain the clustering results (Pei et al. 2022).…”
Section: Algorithms Based On Matrix Factorization (mentioning, confidence: 99%)
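A hedged sketch of the "projection matrices, then k-means" route the statement attributes to Wan et al. (2023b); here the projections are random stand-ins for learned projection matrices, and the averaging step and all names are assumptions made only to show the data flow.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative data flow only: project per-view embeddings of different
# dimensions into a common space, fuse them into a consensus matrix, then
# run k-means on that consensus matrix to get the final labels.
def consensus_then_kmeans(embeddings, dim, n_clusters, seed=0):
    rng = np.random.default_rng(seed)
    projected = []
    for E in embeddings:                                   # E: n_samples x k_v
        P = rng.standard_normal((E.shape[1], dim))         # stand-in for a learned projection
        projected.append(E @ P)
    consensus = np.mean(projected, axis=0)                 # n_samples x dim consensus matrix
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(consensus)

embeddings = [np.random.rand(60, 5), np.random.rand(60, 8)]
labels = consensus_then_kmeans(embeddings, dim=4, n_clusters=3)
print(labels[:10])
```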
“…By treating samples as nodes and relationships between samples as edges, GNNs can capture the underlying relationships and rules between samples through message-propagation mechanisms, which makes them suitable for various types of graphs [9,26,38,41,43,44]. GNNs have gained significant popularity and are widely employed in real-world applications, including recommendation [81], community discovery [25,50], fake news detection [29,85], multi-view clustering [24,74,78,92], bioinformatics [22], hyper-graph analysis [82], and image processing [27,30], because they can find the relationships between samples in changing and multivariate data [28,75,88].…”
Section: Temporal Graph Learning (mentioning, confidence: 99%)
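To make the message-propagation idea in the statement above concrete, here is a generic, minimal sketch of neighbourhood aggregation on a small graph; it is not any specific GNN from the cited works, and the graph and feature values are arbitrary.

```python
import numpy as np

# Generic illustration: samples are nodes, relationships are edges, and each
# node updates its features by averaging over its neighbourhood (with a
# self-loop), repeated for a few propagation rounds.
def propagate(features, adjacency, n_rounds=2):
    A = adjacency + np.eye(adjacency.shape[0])   # add self-loops
    D_inv = np.diag(1.0 / A.sum(axis=1))         # row-normalization
    H = features
    for _ in range(n_rounds):
        H = D_inv @ A @ H                        # mean aggregation over neighbours
    return H

X = np.random.rand(5, 3)                         # 5 nodes with 3-dim features
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
print(propagate(X, A))
```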