2022
DOI: 10.1016/j.ins.2022.04.045
Graph correlated attention recurrent neural network for multivariate time series forecasting

Cited by 43 publications (7 citation statements)
References 32 publications
“…Great effort has been put into developing effective techniques for modeling incomplete graph data. For instance, an adaptive graph recurrent network that combines graph neural networks and RNNs has been proposed for air quality and traffic data imputation [29]. To model temporal session data in discrete state spaces, a graph-nested GRU ODE model preserves the continuous nature of dynamic user preferences: a graph gated neural network encodes both temporal and structural patterns to infer the initial latent states, and a time alignment algorithm aligns the updating time steps of the temporal session graphs [30].…”
Section: One-stage Methods
confidence: 99%
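The models quoted above share one building block: a recurrent cell whose gate inputs are aggregated over a graph before each node's state is updated. Below is a minimal PyTorch sketch of that idea, a GRU cell with graph-convolutional gates; the class name `GraphGRUCell`, the row-normalized adjacency, and all dimensions are illustrative assumptions, not the implementation from [29] or [30].

```python
# Minimal sketch (assumed, not the cited authors' code): a GRU cell whose
# gates mix node features over a fixed graph via adj_norm @ X before the
# usual linear maps, coupling spatial aggregation with recurrent updates.
import torch
import torch.nn as nn


class GraphGRUCell(nn.Module):
    """GRU cell where each gate aggregates features over a graph."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        # One linear map for the update/reset gates, one for the candidate.
        self.gates = nn.Linear(in_dim + hidden_dim, 2 * hidden_dim)
        self.cand = nn.Linear(in_dim + hidden_dim, hidden_dim)

    def forward(self, x, h, adj_norm):
        # x: (num_nodes, in_dim); h: (num_nodes, hidden_dim)
        # adj_norm: (num_nodes, num_nodes) row-normalized adjacency.
        xh = adj_norm @ torch.cat([x, h], dim=-1)        # neighborhood mix
        z, r = torch.sigmoid(self.gates(xh)).chunk(2, dim=-1)
        cand_in = adj_norm @ torch.cat([x, r * h], dim=-1)
        h_tilde = torch.tanh(self.cand(cand_in))
        # Convex combination of the old state and the candidate state.
        return z * h + (1.0 - z) * h_tilde


# Usage: treat each of the N series as a graph node and unroll over time.
N, F, H, T = 8, 1, 16, 24
adj = torch.rand(N, N)
adj_norm = adj / adj.sum(dim=-1, keepdim=True)           # row normalization
cell = GraphGRUCell(F, H)
h = torch.zeros(N, H)
for x_t in torch.randn(T, N, F):                         # step through time
    h = cell(x_t, h, adj_norm)
```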
“…Recent studies have bridged this gap by combining GNNs with Recurrent Neural Networks (RNNs), exploring the interplay of spatial and temporal changes [37] and achieving promising results. The advent of Encoder-Decoder architectures, renowned for their efficacy in processing sequential data [38], has further propelled innovations in TSF.…”
Section: Forecasting
confidence: 99%
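As a companion to the excerpt's mention of Encoder-Decoder architectures for sequential data, here is a minimal PyTorch sketch of a GRU encoder-decoder forecaster; seeding the decoder with the last observation and the autoregressive loop are generic assumptions, not the specific architecture evaluated in the cited work.

```python
# Minimal sketch (assumed): encode a lookback window into a hidden state,
# then autoregressively decode the next `horizon` steps.
import torch
import torch.nn as nn


class Seq2SeqForecaster(nn.Module):
    def __init__(self, n_vars: int, hidden: int, horizon: int):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.GRU(n_vars, hidden, batch_first=True)
        self.decoder = nn.GRUCell(n_vars, hidden)
        self.proj = nn.Linear(hidden, n_vars)

    def forward(self, x):
        # x: (batch, lookback, n_vars) -> (batch, horizon, n_vars)
        _, h = self.encoder(x)          # final hidden: (1, batch, hidden)
        h = h.squeeze(0)
        y_t = x[:, -1, :]               # seed decoder with last observation
        outs = []
        for _ in range(self.horizon):   # autoregressive decoding
            h = self.decoder(y_t, h)
            y_t = self.proj(h)
            outs.append(y_t)
        return torch.stack(outs, dim=1)


# Usage: forecast 12 steps of 8 variables from a 24-step lookback window.
model = Seq2SeqForecaster(n_vars=8, hidden=32, horizon=12)
y_hat = model(torch.randn(4, 24, 8))    # (4, 12, 8)
```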
“…GATs introduced a self-attention mechanism that assigns a different weight to each node in the graph, based on its neighbors' features, while computing its representation. For instance, frameworks such as GAT-LSTM [36], TC-GATN [37], and GCAR [38] use attention graph networks to model intricate and dynamic spatial correlations, leading to improved prediction performance. The advantage of GATs lies in their ability to train without requiring knowledge of the entire graph structure; only each node's neighbors are considered, which enables parallel computation across nodes and fast computation.…”
Section: Related Work
confidence: 99%
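The mechanism described in this excerpt, per-neighbor attention weights computed from node features with no need for the full graph, can be sketched as a single-head graph attention layer. The PyTorch code below is a minimal illustration in the spirit of the standard GAT formulation, not code from GAT-LSTM, TC-GATN, or GCAR; the dense pairwise scoring is an assumption chosen for brevity.

```python
# Minimal single-head graph attention layer sketch (assumed, illustrative):
# score each node pair from concatenated transformed features, mask out
# non-edges, softmax over neighbors, then aggregate with those weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)  # attention scorer

    def forward(self, x, adj):
        # x: (num_nodes, in_dim); adj: (num_nodes, num_nodes) 0/1 mask,
        # assumed to include self-loops so every row has a neighbor.
        h = self.W(x)                                   # (N, out_dim)
        n = h.size(0)
        # Concatenate features of every ordered pair (i, j).
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1),
             h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs).squeeze(-1))     # (N, N) raw scores
        # Only neighbors contribute: mask non-edges before the softmax,
        # which is what lets the layer run without the full graph.
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=-1)                # per-neighbor weights
        return alpha @ h                                # weighted aggregation
```

Because each node's softmax ranges only over its masked neighbors, rows can be computed independently, which is the parallelism the excerpt attributes to GATs.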