2020
DOI: 10.1007/978-3-030-39469-1_11

Gaussian Embedding of Large-Scale Attributed Graphs

Abstract: Graph embedding methods transform high-dimensional and complex graph contents into low-dimensional representations. They are useful for a wide range of graph analysis tasks including link prediction, node classification, recommendation and visualization. Most existing approaches represent graph nodes as point vectors in a low-dimensional embedding space, ignoring the uncertainty present in the real-world graphs. Furthermore, many real-world graphs are large-scale and rich in content (e.g. node attributes). In …

Cited by 5 publications (5 citation statements)
References 12 publications (9 reference statements)
“…We visualize the embedding results of our approach in a two-dimensional space by applying t-SNE [42], as shown in the figure. While this work has shown promising performance beyond existing works, graph representation learning still faces great challenges due to the complexity of real-world networks. Consequently, one of our future works will focus on developing more effective deep graph learning models for dynamic [43], large-scale [44], and heterogeneous graphs [45,46]. Moreover, we intend to explore potential downstream applications of this research, for instance, utilizing multi-attributed-view graph representation learning for social recommendation, anomaly detection, cybersecurity, etc.…”
Section: Visualization of the Node Embeddings
Citation type: mentioning; confidence: 99%
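The excerpt above describes projecting learned node embeddings to two dimensions with t-SNE for visualization. A minimal sketch of that step, assuming a hypothetical `embeddings` array standing in for the learned node representations (shapes and hyperparameters are placeholders, not taken from the cited work):

```python
import numpy as np
from sklearn.manifold import TSNE

# Placeholder for learned node embeddings: 100 nodes, 64 dimensions.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(100, 64))

# Project to 2-D for plotting; perplexity must be below the node count.
coords = TSNE(n_components=2, perplexity=30.0, init="pca",
              random_state=0).fit_transform(embeddings)
print(coords.shape)  # one (x, y) pair per node
```

The resulting `coords` can be passed directly to a scatter plot, typically colored by node label to inspect cluster separation.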
“…Attributed Graph Embedding: Recent studies [1,7,8,11,15,19,21] show that incorporating node attributes along with graph structure produces better node embeddings. TADW [19] incorporates text attributes and graph structure with low-rank matrix factorization.…”
Section: Related Work
Citation type: mentioning; confidence: 99%
“…Noise Modelled Graph Embedding: Most existing graph embedding methods represent nodes as point vectors in the embedding space, ignoring the uncertainty of the embeddings. In contrast, Graph2Gauss [1], VGAE [11], DVNE [20] and GLACE [8] capture the uncertainty of the graph structure by learning node embeddings as Gaussian distributions. DVNE [20] proposes to measure distributional distance using the Wasserstein metric, as it preserves transitivity.…”
Section: Related Work
Citation type: mentioning; confidence: 99%
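For Gaussian node embeddings with diagonal covariance (as in DVNE and related methods), the 2-Wasserstein distance has a closed form: the squared distance is the squared mean difference plus the squared difference of the standard-deviation vectors. A minimal sketch of that formula (function name and test values are illustrative, not from the cited papers):

```python
import numpy as np

def w2_distance(mu1, sigma1, mu2, sigma2):
    """2-Wasserstein distance between two Gaussians with diagonal
    covariance; sigma1 and sigma2 are standard-deviation vectors.
    Diagonal case: W2^2 = ||mu1 - mu2||^2 + ||sigma1 - sigma2||^2."""
    return np.sqrt(np.sum((mu1 - mu2) ** 2) + np.sum((sigma1 - sigma2) ** 2))

# Equal variances: the distance reduces to the Euclidean mean gap.
print(w2_distance(np.zeros(2), np.ones(2),
                  np.array([3.0, 4.0]), np.ones(2)))  # 5.0
```

Because this closed form is a Euclidean norm over stacked (mean, std) vectors, it satisfies the triangle inequality, which is the transitivity property the excerpt attributes to the Wasserstein metric.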
“…Then, we define the joint probability between two node distribution representations as the likelihood of a link existing between them: P(v_i, c_j) = sigmoid(−d(z_{v_i}, z_{c_j})), similarly to the first-order proximity measure in GLACE [15]. Since our graph is unweighted, we can define the prior probability using the structural information observed in the graph as P(v_i, c_j) = 1 / |E_{vc}|.…”
Section: Structural Learning for the V-C Graph (Fig. 2a)
Citation type: mentioning; confidence: 99%
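The link-probability definition in the excerpt maps an embedding distance through a sigmoid, so that nearby embeddings get a probability close to 1 and distant ones close to 0. A minimal sketch, using Euclidean distance as a stand-in for the (unspecified here) distributional distance d used in GLACE:

```python
import numpy as np

def link_probability(z_v, z_c):
    """P(v, c) = sigmoid(-d(z_v, z_c)) with Euclidean d as a placeholder.
    Distance 0 gives P = 0.5; probability decreases as distance grows."""
    d = np.linalg.norm(np.asarray(z_v, dtype=float) - np.asarray(z_c, dtype=float))
    return 1.0 / (1.0 + np.exp(d))  # sigmoid(-d)

print(link_probability([0.0, 0.0], [0.0, 0.0]))  # 0.5
```

In training, such a likelihood is typically contrasted against the structural prior (here 1 / |E_{vc}| for an unweighted graph) via a divergence-based objective.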