2021
DOI: 10.48550/arxiv.2110.05357
Preprint
Graph-Guided Network for Irregularly Sampled Multivariate Time Series

Abstract: In many domains, including healthcare, biology, and climate science, time series are irregularly sampled, with variable time between successive observations, and different subsets of variables (sensors) are observed at different time points, even after alignment to start events. These data create multiple challenges for prevailing models that assume fully observed and fixed-length feature representations. To address these challenges, it is essential to understand the relationships between sensors and how they ev…

Cited by 5 publications (3 citation statements)
References 35 publications
“…To model temporal session data in discrete state spaces, a graph nested GRU ODE model is proposed to preserve the continuous nature of dynamic user preferences, where a graph gated neural network is employed to encode both temporal and structural patterns for inferring initial latent states, and a time alignment algorithm is designed to align the updating time steps of temporal session graphs [30]. For irregularly sampled and multivariate time series, RAINDROP represents every sample as a separate sensor graph and captures time-varying dependencies between sensors with a novel message passing operator [31]. Additionally, to model a dynamical system like the COVID-19 pandemic, a coupled graph ODE model is established, which learns the coupled dynamics of nodes and edges with a graph neural network-based ODE in a continuous manner [32].…”
Section: One-stage Methods
confidence: 99%
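The per-sample sensor-graph idea in the statement above can be illustrated with a minimal mean-aggregation message-passing step. This is a generic sketch, not the actual RAINDROP operator: the adjacency matrix, feature shapes, and residual update rule here are illustrative assumptions.

```python
import numpy as np

def message_passing(x, adj):
    """One round of mean-aggregation message passing over a sensor graph.

    x:   (n_sensors, d) node features for a single sample
    adj: (n_sensors, n_sensors) weighted adjacency encoding
         inter-sensor dependencies (hypothetical, per-sample)
    """
    deg = adj.sum(axis=1, keepdims=True) + 1e-8  # guard against isolated nodes
    messages = (adj @ x) / deg                   # mean of neighbor features
    return x + messages                          # residual update

# Each sample gets its own sensor graph, so dependencies can differ per sample.
x = np.ones((3, 2))                  # 3 sensors, 2-dim features
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])       # a toy chain graph
h = message_passing(x, adj)
```

Because every sample carries its own adjacency, the same operator can express dependencies that vary from sample to sample, which is the property the quoted statement highlights.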
“…(3) Forward imputation: use the last known observation. Note that we were not able to try the more recent state-of-the-art methods for dealing with irregularly sampled time series data [10, 60, 80, 81], since they do not offer a solution to impute missing values; instead, they propose a complex model that is not necessarily applicable to the problem of HAR in smart homes. The results in Table A2 show that forward imputation provides a significant increase in the effectiveness of our approach across several CASAS datasets.…”
Section: Figure A1
confidence: 99%
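The forward-imputation strategy mentioned above (carry the last observed value forward) can be sketched in a few lines. A minimal illustration, assuming missing observations are encoded as NaN in a (time, variables) array:

```python
import numpy as np

def forward_impute(values):
    """Forward imputation: replace each NaN with the last observed value.

    values: (T, n_vars) array with NaN marking missing observations.
    Leading NaNs (no prior observation) are left as NaN.
    """
    out = values.copy()
    for j in range(out.shape[1]):          # impute each variable independently
        last = np.nan
        for t in range(out.shape[0]):
            if np.isnan(out[t, j]):
                out[t, j] = last           # carry last observation forward
            else:
                last = out[t, j]           # remember the newest observation
    return out

x = np.array([[1.0, np.nan],
              [np.nan, 2.0],
              [3.0, np.nan]])
imputed = forward_impute(x)
# column 0 becomes [1, 1, 3]; column 1 becomes [nan, 2, 2]
```

The same behavior is available in pandas as `DataFrame.ffill()`; the explicit loop here just makes the rule visible.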
“…In parallel, researchers started investigating the application of graph neural networks to non-graph-structured data, such as regular and irregular time-series data [11]. In such cases, a simple graph structure is imposed a priori (e.g., based on distances) [12] or is automatically inferred by the neural network [13]. A few works have investigated the application of GNNs to the vision domain for different tasks, mainly related to point clouds [14], with the Vision GNN architecture [15] (ViG) being the most successful in image classification, achieving higher performance than the ViT architecture [10].…”
Section: Introduction
confidence: 99%
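The distance-based a priori graph construction mentioned in the statement above can be illustrated with a k-nearest-neighbor adjacency built from pairwise Euclidean distances. This is a generic sketch of the idea, not the specific construction used in [12]; the coordinates and choice of k are hypothetical.

```python
import numpy as np

def knn_graph(points, k=2):
    """Build a directed k-nearest-neighbor adjacency from pairwise distances.

    points: (n, dim) coordinates (e.g., known sensor positions).
    Returns an (n, n) 0/1 adjacency with k outgoing edges per node.
    """
    n = len(points)
    # Pairwise Euclidean distance matrix via broadcasting.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-loops
    adj = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d[i])[:k]:     # k closest neighbors of node i
            adj[i, j] = 1.0
    return adj

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
A = knn_graph(pts, k=2)
```

The resulting adjacency can then be fed to any GNN layer, which is the sense in which the graph structure is "imposed a priori" rather than learned.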