Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2020
DOI: 10.1145/3394486.3403196

Unsupervised Differentiable Multi-aspect Network Embedding

Abstract: Network embedding is an influential graph mining technique for representing nodes in a graph as distributed vectors. However, the majority of network embedding methods focus on learning a single vector representation for each node, which has been recently criticized for not being capable of modeling multiple aspects of a node. To capture the multiple aspects of each node, existing studies mainly rely on offline graph clustering performed prior to the actual embedding, which results in the cluster membership of…
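The abstract contrasts a single embedding per node with multiple aspect embeddings per node. The idea can be sketched as follows; this is a toy illustration under assumed shapes and names (`aspect_emb`, `context_aspect` are not from the paper), not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_aspects, dim = 5, 3, 8

# Instead of one vector per node, each node holds one vector per aspect.
aspect_emb = rng.normal(size=(n_nodes, n_aspects, dim))

def context_aspect(node, context_vec, emb):
    """Pick the aspect of `node` most similar to the current context vector."""
    scores = emb[node] @ context_vec  # shape: (n_aspects,)
    return int(np.argmax(scores))

context = rng.normal(size=dim)
aspect = context_aspect(2, context, aspect_emb)
assert 0 <= aspect < n_aspects
```

Selecting an aspect from context is one natural way to use multiple vectors per node; the paper's contribution is making such aspect selection differentiable rather than fixed by offline clustering.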


Cited by 22 publications (10 citation statements)
References 23 publications
“…The majority of existing network embedding methods learn only a single representation for each node, which is insufficient for modeling the multiple facets of a node. To overcome this issue, recent works [5,18,20] focus on obtaining multiple vector representations for each node in the network. For example, building on DeepWalk [21], PolyDeepwalk [18] maximizes the likelihood of the observations obtained by performing random walks.…”
Section: Related Work
confidence: 99%
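The statement above refers to random-walk-based likelihood maximization in the DeepWalk family. The walk-generation step can be sketched as follows; the toy graph and function names are illustrative, not taken from either paper:

```python
import random

random.seed(0)

# Toy undirected graph as an adjacency list (illustrative only).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def random_walk(start, length):
    """Uniform random walk, as used to build DeepWalk's training corpus."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(adj[walk[-1]]))
    return walk

walks = [random_walk(v, 5) for v in adj for _ in range(10)]
# Each walk is then treated like a sentence and fed to skip-gram,
# maximizing the likelihood of observed (node, context) pairs.
assert all(len(w) == 5 for w in walks)
```

Methods such as PolyDeepwalk reuse this corpus-building step but learn several vectors per node instead of one.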
“…MCGE [11] applied tensor factorization and defined a multi-view kernel tensor to obtain common latent factors that capture global structure information. Random walks have also been applied in network embedding [14,19,39–42]. MNE [14] learns two vectors for each node simultaneously: a common vector shared by all layers and a lower-dimensional vector for each individual layer.…”
Section: Multiplex Network Embedding
confidence: 99%
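The MNE-style decomposition described above (one shared vector plus a lower-dimensional layer-specific vector per node) can be sketched like this; the projection matrices, weight `w`, and names are assumptions for illustration, not MNE's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, d_common, d_layer, n_layers = 4, 16, 4, 3

common = rng.normal(size=(n_nodes, d_common))            # shared across layers
layer_specific = rng.normal(size=(n_layers, n_nodes, d_layer))
# Hypothetical per-layer projection lifting the low-dimensional part
# into the common embedding space.
proj = rng.normal(size=(n_layers, d_layer, d_common))

def layer_embedding(node, layer, w=0.5):
    """Embedding of `node` in one layer: the shared vector plus a
    weighted, projected layer-specific correction."""
    return common[node] + w * layer_specific[layer, node] @ proj[layer]

e = layer_embedding(0, 1)
assert e.shape == (d_common,)
```

The shared component captures structure common to all layers, while the small per-layer component keeps the parameter count low as the number of layers grows.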
“…This is because we can learn representations of the primary graph by exploiting information from the auxiliary graphs through embedding transformation, which generates representations of the primary graph under different contexts (the auxiliary graphs), a property that is essential in heterogeneous graph representation and community detection [10,24,29].…”
Section: Embedding
confidence: 99%
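The embedding transformation mentioned above can be sketched, under the assumption of a learned linear map between embedding spaces (the matrix `W` and variable names are illustrative; the cited work's actual transformation may differ):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 6, 8

# Hypothetical node embeddings learned on an auxiliary graph.
aux_emb = rng.normal(size=(n, d))
# A (here random, in practice learned) linear map carrying
# auxiliary-graph embeddings into the primary graph's space.
W = rng.normal(size=(d, d))

# Primary-graph representations as seen under the auxiliary context.
primary_view = aux_emb @ W
assert primary_view.shape == (n, d)
```

Each auxiliary graph yields one such transformed view, so the primary graph's nodes receive one representation per context.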