2020
DOI: 10.48550/arxiv.2010.09429
Preprint

Neural Additive Vector Autoregression Models for Causal Discovery in Time Series

Bart Bussmann, Jannes Nys, Steven Latré

Abstract: Causal structure discovery in complex dynamical systems is an important challenge for many scientific domains. Although data from (interventional) experiments is usually limited, large amounts of observational time series data are often available. Current methods that learn causal structure from time series often assume linear relationships and may therefore fail in realistic settings that contain nonlinear relations between the variables. We propose Neural Additive Vector Autoregression (NAVAR) models, …

Cited by 1 publication (2 citation statements)
References: 30 publications
“…Model-free approaches such as transfer entropy (Vicente et al., 2011) are able to detect nonlinear dependencies between time series; however, they suffer from high variance and require large amounts of data for reliable estimation (Tank et al., 2021). In this work, we follow a recent trend that uses neural networks to infer complex nonlinear causal dependencies in time series data (Khanna & Tan, 2020; Nauta et al., 2019; Tank et al., 2021; Bussmann et al., 2020; Trifunov et al., 2019; De Brouwer et al., 2020; Marcinkevičs & Vogt, 2021; Moraffah et al., 2021).…”
Section: Introduction (mentioning)
confidence: 99%
“…In Tank et al. (2021), the function g_i is parameterised by a multilayer perceptron (MLP) regularised by group lasso penalties and trained with proximal gradient descent to shrink the input weights of lagged values of non-causal time series to zero. Bussmann et al. (2020) propose a neural additive VAR model with each time series expressed as a sum of nonlinear functions of the other time series.…”
(mentioning)
confidence: 99%
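
To make the additive structure described in the citation statement above concrete, the following is a minimal sketch of a NAVAR-style model in PyTorch: each source series gets its own small MLP over its own lagged values, and the prediction for every target series is the sum of the per-source contributions plus a bias. All class, function, and variable names are illustrative assumptions, not the authors' released code, and the sparsity penalty is a simplified stand-in for the regularisation schemes used in the cited papers.

    # Sketch only: illustrative names, not the authors' implementation.
    import torch
    import torch.nn as nn

    class NAVARSketch(nn.Module):
        def __init__(self, n_series: int, lags: int, hidden: int = 16):
            super().__init__()
            # One MLP per source series: maps its K lags to a contribution
            # for each of the n_series target series.
            self.contributions = nn.ModuleList([
                nn.Sequential(nn.Linear(lags, hidden), nn.ReLU(), nn.Linear(hidden, n_series))
                for _ in range(n_series)
            ])
            self.bias = nn.Parameter(torch.zeros(n_series))

        def forward(self, x_lagged: torch.Tensor):
            # x_lagged: (batch, n_series, lags)
            # contribs[i]: (batch, n_series), effect of source series i on every target.
            contribs = [mlp(x_lagged[:, i, :]) for i, mlp in enumerate(self.contributions)]
            contribs = torch.stack(contribs, dim=1)        # (batch, n_series, n_series)
            prediction = contribs.sum(dim=1) + self.bias   # additive combination
            return prediction, contribs

    def loss_fn(pred, target, contribs, lam=0.1):
        # MSE plus a simple sparsity penalty on the contributions, so that source
        # series with no causal influence end up with near-zero contributions.
        mse = nn.functional.mse_loss(pred, target)
        sparsity = contribs.abs().mean()
        return mse + lam * sparsity

    # Example usage on random data (purely illustrative):
    model = NAVARSketch(n_series=4, lags=3)
    x = torch.randn(32, 4, 3)                    # batch of lagged inputs
    pred, contribs = model(x)
    loss = loss_fn(pred, torch.randn(32, 4), contribs)

In this sketch, the magnitude of each entry of contribs plays the role of a causal-influence score between a source and a target series; the cited works differ mainly in how these contributions are parameterised and penalised.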