2023
DOI: 10.1609/aaai.v37i9.26285

Cluster-Guided Contrastive Graph Clustering Network

Abstract: Benefiting from its ability to exploit intrinsic supervision information, contrastive learning has recently achieved promising performance in deep graph clustering. However, we observe that two drawbacks of the positive and negative sample construction mechanisms limit further performance improvement of existing algorithms. 1) The quality of positive samples heavily depends on carefully designed data augmentations, while inappropriate data augmentations easily lead to the s…
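To make the critiqued mechanism concrete, below is a minimal sketch of the augmentation-based, node-level contrastive (InfoNCE) objective that typical methods of this kind use. This is a generic illustration, not CCGC's actual loss; the function name, temperature value, and two-view setup are assumptions introduced for the example.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """Generic node-level InfoNCE loss over two embedding views.

    z1, z2: (N, d) embeddings of the same N nodes under two augmented
    views of the graph. For node i, (z1[i], z2[i]) is treated as the
    positive pair; every other node in the batch acts as a negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature              # (N, N) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    # Diagonal entries are positives, off-diagonal entries are negatives.
    return F.cross_entropy(logits, labels)

# Toy usage: 8 nodes, 16-dim embeddings from two hypothetical augmented views.
if __name__ == "__main__":
    view1 = torch.randn(8, 16)
    view2 = view1 + 0.1 * torch.randn(8, 16)        # stand-in for an augmented view
    print(info_nce_loss(view1, view2).item())
```

Because the positives here come entirely from the two augmented views, the learned representations hinge on how well those augmentations preserve semantics, which is exactly the sensitivity the abstract points out.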

Cited by 48 publications (3 citation statements) · References 33 publications
“…The time complexity of the proposed AMGC could be discussed from the network and loss computation aspects. … (Liu et al. 2023), and CCGC (Yang et al. 2023). In addition, we report the performance of two non-graph attribute-missing clustering methods (i.e., IMKAM (Liu 2021) and DIMVC (Xu et al. 2022)) and three graph counterparts (i.e., SAT (Chen et al. 2022), ITR (Tu et al. 2022), and SVGA (Yoo et al. 2022)).…”
Section: Computational Complexity
Mentioning confidence: 99%
“…The key prerequisite for the success of current DGC methods (Bo et al. 2020; Cui et al. 2020; Tu et al. 2021; Peng et al. 2021; Gong et al. 2022a,b; Liu et al. 2023; Hu et al. 2023; Yang et al. 2022, 2023) lies in the assumption that all samples within a graph are trustworthy and complete. However, such an assumption may not always hold in practical scenarios since it is hard to collect all information from graph data.…”
Section: Introduction
Mentioning confidence: 99%
“…Graph neural networks (GNNs) are widely applied in the real world [37,52,53] and are attracting the attention of researchers [51,56,87,89]. By treating samples as nodes and relationships between samples as edges, GNNs can easily capture the underlying relationships and rules between samples through message propagation mechanisms, which are suitable for various types of graphs [9,26,38,41,43,44].…”
Section: Temporal Graph Learning
Mentioning confidence: 99%
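As a companion to the message-propagation point in the statement above, here is a minimal sketch of one round of mean-aggregation message passing over an adjacency matrix. It is a generic illustration, not the scheme of any cited paper; the function name and the self-loop/mean-aggregation choices are assumptions introduced for the example.

```python
import torch

def mean_message_passing(x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """One round of mean-aggregation message passing.

    x:   (N, d) node features (samples as nodes)
    adj: (N, N) binary adjacency matrix (edges as sample relationships)
    Each node is updated with the average of its neighbours' features,
    self-loop included, which is how relational structure propagates.
    """
    adj_hat = adj + torch.eye(adj.size(0), device=adj.device)  # add self-loops
    deg = adj_hat.sum(dim=1, keepdim=True).clamp(min=1.0)      # node degrees
    return (adj_hat @ x) / deg                                  # neighbourhood mean

# Toy usage: 4 samples on a path graph 0-1-2-3.
if __name__ == "__main__":
    adj = torch.tensor([[0., 1., 0., 0.],
                        [1., 0., 1., 0.],
                        [0., 1., 0., 1.],
                        [0., 0., 1., 0.]])
    feats = torch.randn(4, 3)
    print(mean_message_passing(feats, adj))
```

Stacking such rounds, with learned weights and nonlinearities in between, gives the kind of GNN encoder that graph clustering methods in this line of work typically build on.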