2021 IEEE Data Science and Learning Workshop (DSLW)
DOI: 10.1109/dslw51110.2021.9523399

Online Non-linear Topology Identification from Graph-connected Time Series

Cited by 19 publications (14 citation statements). References 14 publications.

“…However, notice that the estimates of LMS are prone to observation noise and can be unstable in practice. To avoid this problem, we formulate (16) in a recursive least squares (RLS) sense, which further provides necessary stability in addition to faster convergence:…”
Section: Online Learning
Mentioning confidence: 99%
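
The LMS-versus-RLS contrast drawn in this excerpt can be made concrete. Below is a minimal sketch of the two recursions for a generic online linear model y_t = w^T x_t + noise; it is not the authors' equation (16), which the excerpt does not reproduce, and the step size, forgetting factor, and initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
w_true = rng.standard_normal(d)

# LMS state: just the weight vector; mu is an assumed step size.
w_lms = np.zeros(d)
mu = 0.05

# RLS state: weights plus inverse-covariance matrix P;
# lam is an assumed forgetting factor.
w_rls = np.zeros(d)
P = 1e3 * np.eye(d)
lam = 0.99

for t in range(500):
    x = rng.standard_normal(d)
    y = w_true @ x + 0.1 * rng.standard_normal()

    # LMS: stochastic-gradient step on the instantaneous squared error,
    # which is what makes the estimate sensitive to observation noise.
    w_lms += mu * x * (y - w_lms @ x)

    # RLS: exact recursive solution of the exponentially weighted
    # least-squares problem -- typically more stable and faster to converge.
    k = P @ x / (lam + x @ P @ x)        # gain vector
    w_rls += k * (y - w_rls @ x)
    P = (P - np.outer(k, x @ P)) / lam

print("LMS error:", np.linalg.norm(w_lms - w_true))
print("RLS error:", np.linalg.norm(w_rls - w_true))
```
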
“…The above discussion motivates the need for algorithms that can learn nonlinear and dynamic topologies. Kernels are an ideal choice in this regard due to their interpretability and capability to learn functions online [14]-[16]. In kernel frameworks, the data points are transformed to a function space, where a linear relationship exists between them.…”
Section: Introduction
Mentioning confidence: 99%
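
The “linear relationship” in a function space that this excerpt describes can be illustrated with a kernel-LMS-style sketch: the learned function is a linear combination of kernel evaluations, f(x) = sum_i alpha_i k(x_i, x), updated one sample at a time via a functional gradient step. The RBF kernel, learning rate, and toy target below are assumptions for illustration only.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel -- an assumed choice.
    return np.exp(-gamma * np.sum((x - y) ** 2))

dictionary, alphas = [], []    # stored samples and their coefficients
eta = 0.3                      # assumed learning rate

def predict(x):
    # f(x) = sum_i alpha_i k(x_i, x): linear in the kernel feature space.
    return sum(a * rbf(xi, x) for a, xi in zip(alphas, dictionary))

rng = np.random.default_rng(1)
for t in range(200):
    x = rng.uniform(-2, 2, size=2)
    y = np.sin(x[0]) + 0.1 * rng.standard_normal()   # toy nonlinear target
    err = y - predict(x)
    # Functional gradient step: append the new sample with weight eta * err.
    dictionary.append(x)
    alphas.append(eta * err)

print("stored coefficients:", len(alphas))   # one per sample seen
```

Note the final print: the expansion acquires one coefficient per sample, which is exactly the growth the next excerpt calls the curse of dimensionality.
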
“…The above discussion motivates the need for algorithms that can learn nonlinear and non-stationary topologies. Kernels are an ideal choice in this regard due to their interpretability and capability to learn functions online [20]-[22]. However, a major drawback associated with the kernel-based algorithms is the curse of dimensionality [23], i.e., the number of parameters required to express the function increases with the number of data samples, and the computational complexity becomes prohibitive at some point.…”
Section: Introduction
Mentioning confidence: 99%
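
To put rough numbers on that growth, here is a back-of-the-envelope sketch (hypothetical stream lengths, float64 storage assumed):

```python
# A kernel expansion keeps one coefficient per sample, so parameter count
# and per-prediction kernel evaluations scale linearly with the stream
# length t, and a precomputed Gram matrix would need t^2 entries.
for t in (1_000, 100_000, 10_000_000):
    gram_gb = t**2 * 8 / 1e9          # float64 Gram matrix size in GB
    print(f"t={t:>11,}  parameters={t:>11,}  Gram matrix ~ {gram_gb:,.1f} GB")
```
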
“…The nonlinearity is tackled using kernel methods, and the curse of dimensionality of kernels is mitigated through random feature (RF) approximation. Kernel techniques are conventionally used for nonlinear topology estimation [14]-[16] and help transform the problem into an amenable form. Instead, RF is typically used to reduce the complexity of nonlinear models as well as to ensure that connectivity is inferred without revealing nodal attributes [17]-[22].…”
Section: Introduction
Mentioning confidence: 99%
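
The random feature (RF) approximation mentioned here can be sketched with random Fourier features for the Gaussian kernel: a fixed D-dimensional map z(x) whose inner products approximate k(x, y), so any linear online rule (LMS or RLS) can then run on z(x) at a cost independent of the number of samples. This is a generic Rahimi-Recht-style construction, not necessarily the exact one used in these papers.

```python
import numpy as np

rng = np.random.default_rng(2)
d, D = 5, 200                  # input dimension, number of random features

# For the unit-bandwidth Gaussian kernel k(x, y) = exp(-||x - y||^2 / 2),
# the spectral distribution is standard normal; phases are uniform.
W = rng.standard_normal((D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def z(x):
    # Fixed D-dimensional feature map with z(x) @ z(y) ~ k(x, y).
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.sum((x - y) ** 2) / 2.0)
approx = z(x) @ z(y)
print(f"exact kernel {exact:.4f}  vs  RFF approximation {approx:.4f}")
```
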