2022 IEEE Data Science and Learning Workshop (DSLW)
DOI: 10.1109/dslw53931.2022.9820066
Graph Filtering Over Expanding Graphs

Abstract: Data processing tasks over graphs couple the data residing over the nodes with the topology through graph signal processing tools. Graph filters are one such prominent tool, having been used in applications such as denoising, interpolation, and classification. However, they are mainly used on fixed graphs, although many networks grow in practice, with nodes continually attaching to the topology. Re-training the filter every time a new node attaches is computationally demanding; hence an online learning solution…
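The graph filters the abstract refers to are polynomials in a graph shift operator applied to a node signal. The following is a minimal sketch of that idea; the shift operator (here, a toy adjacency matrix) and the filter taps are illustrative placeholders, not the paper's exact construction.

```python
import numpy as np

def graph_filter(S, h, x):
    """Apply the polynomial graph filter sum_k h[k] * S^k to signal x."""
    y = np.zeros_like(x, dtype=float)
    Sk_x = x.astype(float)      # running term S^k x, starting at S^0 x = x
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x         # advance to S^(k+1) x
    return y

# Toy example: path graph on 3 nodes, adjacency matrix as shift operator.
S = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
x = np.array([1., 0., 0.])      # impulse on node 0
h = [0.5, 0.25]                 # filter taps h_0, h_1
y = graph_filter(S, h, x)       # = 0.5*x + 0.25*S@x
```

Because the filter is a fixed-order polynomial in S, attaching a new node only changes S, which is what makes re-training (or online updating) of the taps the relevant question.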

Cited by 5 publications (1 citation statement)
References 37 publications
“…To account for the underlying topological dynamics, we propose an online framework to perform edge flow prediction over an expanding simplicial complex. In detail, we first use the simplicial convolutional filter proposed in [10,21] as the flow predictor on the new edge; then we consider an online gradient descent algorithm to update the filter parameters [22,23], generalizing the methods developed for the node signal space [24,25,26]. Online gradient descent, as a simple update rule, has sub-linear regret bounds [22].…”
Section: Introduction
confidence: 99%
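The online gradient descent update mentioned in the citing statement can be sketched as follows: each time a new node (or edge) arrives, predict its signal with the current filter taps, observe the true value, and take one gradient step on the squared error. The step size and the synthetic stream below are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

def ogd_step(h, features, target, lr=0.1):
    """One online gradient descent step on the loss 0.5*(h @ features - target)^2."""
    pred = h @ features
    grad = (pred - target) * features   # gradient of the squared error w.r.t. h
    return h - lr * grad

# Taps of a (hypothetical) order-2 filter, updated as data streams in.
h = np.zeros(2)
# Each pair stands in for the features and observed signal of a newly attached node.
stream = [(np.array([1.0, 0.0]), 1.0),
          (np.array([0.0, 1.0]), 0.5)]
for feats, tgt in stream:
    h = ogd_step(h, feats, tgt)
```

This single-step update rule is what gives the sub-linear regret guarantees the statement cites: the cumulative prediction loss grows slower than linearly in the number of arriving nodes, so the per-node excess loss vanishes over time.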