2021
DOI: 10.48550/arxiv.2110.08565
Preprint
Dynamic Graph Echo State Networks

Domenico Tortorella,
Alessio Micheli

Abstract: Dynamic temporal graphs represent evolving relations between entities, e.g. interactions between social network users or infection spreading. We propose an extension of graph echo state networks for the efficient processing of dynamic temporal graphs, with a sufficient condition for their echo state property, and an experimental analysis of reservoir layout impact. Compared to temporal graph kernels that need to hold the entire history of vertex interactions, our model provides a vector encoding for the dynamic graph…
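The paper itself gives the precise state update and the sufficient condition for the echo state property; as a rough illustration of the idea summarized in the abstract, the NumPy sketch below builds an untrained reservoir encoder over a sequence of graph snapshots. The function names (init_reservoir, encode_dynamic_graph), the exact update rule, and the sum-pooling readout are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_reservoir(n_inputs, n_units, rho=0.9, input_scale=1.0):
    """Fixed random weights; the recurrent matrix is rescaled to spectral radius rho.
    (The paper's actual stability condition also involves the graphs' spectral norm;
    this plain rescaling is an assumption for the sketch.)"""
    W_in = rng.uniform(-input_scale, input_scale, size=(n_units, n_inputs))
    W_hat = rng.uniform(-1.0, 1.0, size=(n_units, n_units))
    W_hat *= rho / np.max(np.abs(np.linalg.eigvals(W_hat)))
    return W_in, W_hat

def encode_dynamic_graph(adjacencies, features, W_in, W_hat):
    """
    adjacencies: list of (V, V) adjacency matrices, one per time-step
    features:    list of (V, n_inputs) vertex feature matrices, one per time-step
    Returns a graph-level embedding obtained by sum-pooling the final vertex states.
    """
    n_units = W_hat.shape[0]
    V = adjacencies[0].shape[0]
    X = np.zeros((V, n_units))                      # vertex states, x_0 = 0
    for A, U in zip(adjacencies, features):
        # hypothetical update: neighbours' previous states propagated through the
        # fixed reservoir, plus the current vertex input, squashed by tanh
        X = np.tanh(U @ W_in.T + A @ X @ W_hat.T)
    return X.sum(axis=0)                            # untrained graph encoding

# Toy usage: 3 snapshots of a 4-vertex graph with 2-dimensional vertex features.
W_in, W_hat = init_reservoir(n_inputs=2, n_units=16)
adjacencies = [rng.integers(0, 2, size=(4, 4)) for _ in range(3)]
features = [rng.normal(size=(4, 2)) for _ in range(3)]
embedding = encode_dynamic_graph(adjacencies, features, W_in, W_hat)
print(embedding.shape)  # (16,)
```

Because the reservoir is fixed, only a readout on the pooled embedding (e.g. a linear classifier) would need training, which is where the efficiency advantage over kernels that hold the full interaction history comes from.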

Cited by 1 publication (1 citation statement)
References 8 publications
“…Weights: ESN reservoir weights are fixed, random values from a uniform distribution on [−1, 1], by convention [18,17]. Some works have sought to incorporate dynamic weights: [20], for example, found that a dynamic graph ESN could perform twelve classification tasks with similar accuracy to non-dynamic graph ESNs, but with reduced memory requirements. The authors of [2] sample reservoir weights from uniform, arcsine, and Gaussian distributions: they find that an arcsine distribution can improve time series prediction performance, but speculate that it is the sparsity resulting from the distribution (and not the sampled values themselves) that is responsible.…”
Section: Structural Features Of Echo State Network
confidence: 99%
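The quoted statement contrasts the conventional uniform sampling on [−1, 1] with the arcsine and Gaussian alternatives studied in [2]. The sketch below shows one way such reservoir matrices could be sampled and then rescaled to a common spectral radius so the distributions are compared on equal footing; the function name sample_reservoir and the rescaling step are illustrative assumptions, not code from the cited works.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_reservoir(n_units, distribution="uniform", rho=0.95):
    """Sample a fixed recurrent weight matrix and rescale it to spectral radius rho."""
    if distribution == "uniform":
        W = rng.uniform(-1.0, 1.0, size=(n_units, n_units))
    elif distribution == "arcsine":
        # arcsine distribution on [-1, 1]: cos(pi * U) with U ~ Uniform(0, 1),
        # which concentrates mass near the endpoints -1 and +1
        W = np.cos(np.pi * rng.uniform(size=(n_units, n_units)))
    elif distribution == "gaussian":
        W = rng.normal(size=(n_units, n_units))
    else:
        raise ValueError(f"unknown distribution: {distribution}")
    return W * (rho / np.max(np.abs(np.linalg.eigvals(W))))

# After rescaling, every variant has the same spectral radius (~0.95),
# so any performance difference is down to the shape of the distribution.
for dist in ("uniform", "arcsine", "gaussian"):
    W = sample_reservoir(200, dist)
    print(dist, np.max(np.abs(np.linalg.eigvals(W))))
```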