2020
DOI: 10.48550/arxiv.2006.14330
Preprint

Time-varying Graph Representation Learning via Higher-Order Skip-Gram with Negative Sampling

Abstract: Representation learning models for graphs are a successful family of techniques that project nodes into feature spaces that can be exploited by other machine learning algorithms. Since many real-world networks are inherently dynamic, with interactions among nodes changing over time, these techniques can be defined both for static and for time-varying graphs. Here, we build upon the fact that the skip-gram embedding approach implicitly performs a matrix factorization, and we extend it to perform implicit tensor…
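The abstract builds on the well-known result that skip-gram with negative sampling (SGNS) implicitly factorizes a shifted pointwise mutual information (PMI) matrix of co-occurrence statistics (Levy & Goldberg, 2014). As a minimal sketch of that matrix-factorization view only (not of the paper's higher-order tensor extension), the toy NumPy example below builds a shifted positive PMI matrix from illustrative co-occurrence counts and factorizes it explicitly with a truncated SVD; the co-occurrence counts, the number of negative samples k, and the SVD-based factorization are assumptions made here for illustration.

```python
import numpy as np

# Toy co-occurrence counts between "nodes" (rows) and "contexts" (columns),
# e.g. collected from random walks on a graph. These values are illustrative.
C = np.array([[10.,  2.,  0.],
              [ 2.,  8.,  3.],
              [ 0.,  3.,  6.]])

k = 5  # number of negative samples used by SGNS (assumed for this sketch)

total = C.sum()
p_ij = C / total                              # joint probability estimates
p_i = C.sum(axis=1, keepdims=True) / total    # row marginals
p_j = C.sum(axis=0, keepdims=True) / total    # column marginals

with np.errstate(divide="ignore"):
    pmi = np.log(p_ij / (p_i * p_j))          # pointwise mutual information
sppmi = np.maximum(pmi - np.log(k), 0.0)      # shifted positive PMI

# Explicit low-rank factorization of the SPPMI matrix via truncated SVD;
# SGNS approximates such a factorization implicitly during training.
d = 2
U, S, Vt = np.linalg.svd(sppmi)
node_emb = U[:, :d] * np.sqrt(S[:d])          # node embeddings
ctx_emb = Vt[:d, :].T * np.sqrt(S[:d])        # context embeddings

print(node_emb @ ctx_emb.T)                   # low-rank reconstruction of SPPMI
```

The paper's contribution, per the abstract, is to generalize this implicit factorization from a matrix over node-context pairs to a tensor that also spans the temporal dimension of a time-varying graph; that higher-order case is not reproduced in this sketch.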

Cited by 0 publications
References: 44 publications (59 reference statements)