2021
DOI: 10.1016/j.neunet.2021.05.026

Spectral embedding network for attributed graph clustering

Cited by 41 publications (17 citation statements)
References: 59 publications
“…If no cluster number is specified, COMMO will automatically determine the number of clusters using the k-nearest neighbor algorithm (see Supplementary Section S1.1). Second, eight clustering methods are used to identify gene modules, including FLAME (Fuzzy clustering by Local Approximation of Memberships) (Fu and Medico 2007), K-means (Timmerman et al 2013), SOM (self-organizing mapping) (Wang et al 2002), spectral clustering (Zhang et al 2021), Agglomerative (Liu et al 2022b), Hclust (Bu et al 2022), NMF (non-negative matrix factorization) (Liefeld et al 2023), and ICA (independent component analysis) (Hyvärinen 2013). Detailed descriptions of these eight methods can be found in the Supplementary Materials.…”
Section: Methods
confidence: 99%
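
The multi-method module detection described in this excerpt can be approximated with off-the-shelf clustering routines. The sketch below is a hypothetical illustration, not COMMO's actual code: the gene-expression matrix, the module count k, and the rule of assigning each gene to its dominant NMF/ICA component are all assumptions.

```python
# Hypothetical sketch (not COMMO's implementation): applying several of the
# listed clustering methods to a gene-expression matrix with scikit-learn.
import numpy as np
from sklearn.cluster import KMeans, SpectralClustering, AgglomerativeClustering
from sklearn.decomposition import NMF, FastICA

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(200, 50)))  # placeholder: 200 genes x 50 samples, non-negative for NMF
k = 8                                   # assumed module count; COMMO can infer it via k-NN

labels = {
    "kmeans": KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X),
    "spectral": SpectralClustering(n_clusters=k, affinity="nearest_neighbors",
                                   random_state=0).fit_predict(X),
    # Hclust in R roughly corresponds to agglomerative clustering with a chosen linkage.
    "agglomerative": AgglomerativeClustering(n_clusters=k).fit_predict(X),
    # NMF/ICA return factor loadings; taking each gene's dominant component is one
    # common way to turn them into hard module labels.
    "nmf": NMF(n_components=k, init="nndsvda", max_iter=500,
               random_state=0).fit_transform(X).argmax(axis=1),
    "ica": np.abs(FastICA(n_components=k, random_state=0).fit_transform(X)).argmax(axis=1),
}
for name, lab in labels.items():
    print(name, np.bincount(lab, minlength=k))  # module sizes per method
```

In a real pipeline the different label sets would then be compared or consolidated; FLAME and SOM are omitted here because they are not part of scikit-learn.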
“…DAEGC [8] optimizes graph reconstruction loss and a clustering loss jointly. SENet [46] uses a spectral clustering loss to learn node embeddings. GC-VGE [37] introduces a joint framework for clustering and representation learning by utilizing a variational graph embedding mechanism.…”
Section: B. Baseline Methods
confidence: 99%
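
For intuition on what a spectral clustering loss over node embeddings can look like, here is a minimal PyTorch sketch of a standard relaxation: minimize tr(Hᵀ L H) over embeddings H with a soft orthogonality penalty. This is a generic illustration of the technique, not the SENet implementation; the dense Laplacian and the penalty weight are assumptions.

```python
# Minimal sketch of a relaxed spectral clustering objective used as a loss on
# node embeddings H (illustrative only, not the SENet code).
import torch

def spectral_clustering_loss(H, A, ortho_weight=1.0):
    """H: (n, d) node embeddings; A: (n, n) dense symmetric adjacency, no self-loops."""
    deg = A.sum(dim=1)
    L = torch.diag(deg) - A                 # unnormalized graph Laplacian
    cut = torch.trace(H.t() @ L @ H)        # small when adjacent nodes get similar rows
    I = torch.eye(H.shape[1], device=H.device)
    ortho = ((H.t() @ H - I) ** 2).sum()    # push H toward orthonormal columns
    return cut + ortho_weight * ortho

# toy usage on a random symmetric graph
n, d = 6, 2
A = (torch.rand(n, n) > 0.5).float()
A = ((A + A.t()) > 0).float()
A.fill_diagonal_(0)
H = torch.randn(n, d, requires_grad=True)
loss = spectral_clustering_loss(H, A)
loss.backward()                             # gradients flow into the embeddings
```

In a joint framework such as those named above, this term would be combined with a reconstruction or clustering loss and optimized end to end.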
“…Influenced by deep learning, graph representation learning is receiving increasing attention from researchers. For example, SENET [12] improved graph structure by fusing common neighbor nodes and aggregated multi-order node features and structural features using kernel functions. LGNN [13] introduced supervised learning into multi-scale graph representation learning.…”
Section: Related Work, 2.1 Graph Representation Learning
confidence: 99%
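
One generic way to fuse common-neighbor information into a graph's structure, in the spirit of the description above, is to add a scaled common-neighbor count matrix A·A to the original adjacency. The NumPy sketch below illustrates that idea under assumed scaling and weighting choices; it is not the SENET construction.

```python
# Hedged sketch: augment a binary adjacency matrix with common-neighbor counts
# (one possible reading of "fusing common neighbor nodes", not the SENET code).
import numpy as np

def fuse_common_neighbors(A, weight=0.5):
    """A: (n, n) binary symmetric adjacency without self-loops."""
    common = A @ A                           # (i, j) counts neighbors shared by i and j
    np.fill_diagonal(common, 0)              # drop self-counts
    common = common / max(common.max(), 1)   # scale to [0, 1]
    return A + weight * common               # fused structure for downstream propagation

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(fuse_common_neighbors(A))
```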