2022 56th Asilomar Conference on Signals, Systems, and Computers 2022
DOI: 10.1109/ieeeconf56349.2022.10052045
Online Filtering over Expanding Graphs

Cited by 4 publications (2 citation statements)
References 26 publications
“…We perform online forecasting on the expanding graph using the GEVAR filter bank. As an alternative to batch processing, we update the filter bank online using tools from online machine learning, which have already been used for graph signal processing tasks in time-invariant settings [9,24,25]. Specifically, we use adaptive online gradient descent, which has a provable sub-linear regret upper bound w.r.t. the batch solution [26].…”

Section: Introduction
confidence: 99%
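The statement above describes updating the coefficients of a polynomial graph filter with online gradient descent instead of refitting in batch. A minimal sketch of one such update step is below; the function name, the squared-loss objective, and the fixed step size are illustrative assumptions, not the paper's exact GEVAR update.

```python
import numpy as np

def online_graph_filter_step(h, S, x, y, eta):
    """One online gradient-descent update of polynomial graph filter
    coefficients h. S is the graph shift operator (e.g. adjacency),
    x the input graph signal, y the target, eta the step size.
    Hypothetical sketch of the general technique, not the authors'
    exact predictor."""
    K = len(h)
    # Build the shifted signals S^k x for k = 0 .. K-1.
    Sx = [x]
    for _ in range(K - 1):
        Sx.append(S @ Sx[-1])
    Z = np.stack(Sx, axis=1)        # N x K regression matrix
    y_hat = Z @ h                   # filter output sum_k h_k S^k x
    grad = 2 * Z.T @ (y_hat - y)    # gradient of the squared loss
    return h - eta * grad, y_hat
```

Each new sample triggers one such step, so the filter tracks the data stream without storing or reprocessing the full history.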
“…To account for the underlying topological dynamics, we propose an online framework to perform edge flow prediction over an expanding simplicial complex. In detail, we first use the simplicial convolutional filter proposed in [10,21] as the flow predictor on the new edge; we then apply an online gradient descent algorithm to update the filter parameters [22,23], generalizing the methods developed for the node signal space [24,25,26]. Online gradient descent, despite being a simple update rule, has sub-linear regret bounds [22].…”

Section: Introduction
confidence: 99%
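The sub-linear regret guarantee this statement cites comes from running online gradient descent with a diminishing step size (eta proportional to 1/sqrt(t)), the standard schedule for convex per-round losses. A generic sketch follows; `grad_fn` and the function name are illustrative assumptions, not the paper's specific edge-flow predictor.

```python
import numpy as np

def ogd_diminishing_steps(grad_fn, theta0, T, eta0=0.5):
    """Online gradient descent with step size eta0 / sqrt(t), the
    schedule that yields O(sqrt(T)) regret for convex round losses.
    grad_fn(theta, t) returns the gradient of the loss revealed at
    round t. Hypothetical sketch of the generic technique."""
    theta = theta0.copy()
    for t in range(1, T + 1):
        eta_t = eta0 / np.sqrt(t)          # diminishing step size
        theta = theta - eta_t * grad_fn(theta, t)
    return theta
```

In the cited setting, `theta` would hold the simplicial filter parameters and each round's loss would measure the prediction error on the newly arrived edge flow.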