2019
DOI: 10.1063/1.5134845
Using machine learning to assess short term causal dependence and infer network links

Abstract: We introduce and test a general machine-learning-based technique for the inference of short term causal dependence between state variables of an unknown dynamical system from time series measurements of its state variables. Our technique leverages the results of a machine learning process for short time prediction to achieve our goal. The basic idea is to use the machine learning to estimate the elements of the Jacobian matrix of the dynamical flow along an orbit. The type of machine learning that we employ is…
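The abstract's core idea — estimate the Jacobian of the one-step flow along an orbit and read causal links off its entries — can be sketched as follows. This is a minimal toy, not the paper's method: in place of a trained reservoir-computing predictor it uses a known two-variable map (hypothetical, chosen so that variable 0 drives variable 1 but not vice versa), and estimates the Jacobian by finite differences.

```python
import numpy as np

# Stand-in for a trained one-step predictor F: x_t -> x_{t+1}.
# In the paper's setting F would be learned from time series data;
# here a known toy map keeps the sketch self-contained.
# Variable 0 drives variable 1, but not vice versa.
A = np.array([[1.2, 0.0],
              [0.5, 1.2]])

def predictor(x):
    return np.tanh(A @ x)

def jacobian_fd(f, x, eps=1e-6):
    """Estimate the Jacobian of f at x by central finite differences."""
    n = x.size
    J = np.empty((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x + dx) - f(x - dx)) / (2.0 * eps)
    return J

# Average |J_ij| along an orbit; a persistently large (i, j) entry
# indicates short-term causal dependence of variable i on variable j.
x = np.array([0.4, 0.3])
T = 200
acc = np.zeros((2, 2))
for _ in range(T):
    acc += np.abs(jacobian_fd(predictor, x))
    x = predictor(x)
acc /= T
print(acc)  # the (1, 0) entry is large, the (0, 1) entry vanishes: only a 0 -> 1 link
```

Thresholding the averaged matrix then yields the inferred network links; the paper replaces the known map with a machine-learned predictor so the same estimate can be made from measured data alone.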

Cited by 37 publications (18 citation statements)
References 49 publications
“…In this context, reservoir computing (RC) [29][30][31], a variant of the recurrent neural network model, is effective for inferring unmeasured variables of chaotic systems from values of a known variable [32], forecasting the dynamics of chaotic oscillators [33,34], predicting the evolution of the phase of chaotic dynamics [35], and predicting critical transitions in dynamical systems [36]. RC has also been used to detect synchronization [37][38][39] and spiking-bursting phenomena [40] and to infer network links [41] in coupled systems. Apart from RC, researchers have applied other artificial neural network architectures, such as the feed-forward neural network (FFNN) [42,43] and long short-term memory (LSTM) [44][45][46], for purposes such as detecting phase transitions in complex networks [47], inferring functional connectivity in coupled systems [48], and forecasting complex dynamics [49].…”
Section: Introduction
confidence: 99%
“…Since the seminal work by Jaeger et al. [1,2], a new direction has opened up in ESN-based machine learning [3][4][5][6][7][8]. Due to its simplicity, the ESN has emerged as a valuable tool in areas ranging from neuroscience [9,10], speech recognition [11], language processing [12], and robotics [13] to stock market prediction [14], inference of connectivity [15,16], network classification [17], and even prediction of recent COVID-19 epidemic trends [18]. Some studies have also focused on suitable choices of the reservoir weights [4,19,20] and on the optimal range of hyperparameters [21][22][23][24][25][26][27][28][29][30], which are crucial for good prediction.…”
Section: Introduction
confidence: 99%
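The echo state network (ESN) discussed in the statement above can be sketched in a few lines. This is a minimal illustrative implementation, not the cited authors' setup: the reservoir size, spectral radius, ridge penalty, and washout length are arbitrary illustrative choices, and a sine wave stands in for chaotic training data.

```python
import numpy as np

rng = np.random.default_rng(0)

# A minimal echo state network for one-step prediction of a scalar series.
# All hyperparameters below are illustrative, not tuned values.
N = 200                                            # reservoir size
W_in = rng.uniform(-0.5, 0.5, N)                   # input weights (fixed, random)
W = rng.uniform(-0.5, 0.5, (N, N))                 # recurrent weights (fixed, random)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # rescale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input series u; return the state at each step."""
    r = np.zeros(N)
    states = []
    for u_t in u:
        r = np.tanh(W @ r + W_in * u_t)
        states.append(r.copy())
    return np.array(states)

# Toy training series (a stand-in for measured chaotic data).
t = np.arange(1000)
u = np.sin(0.1 * t)
R = run_reservoir(u[:-1])          # reservoir states
y = u[1:]                          # one-step-ahead targets

# Only the linear readout is trained, by ridge regression
# (discarding a 100-step washout transient).
beta = 1e-6
Rw, yw = R[100:], y[100:]
W_out = np.linalg.solve(Rw.T @ Rw + beta * np.eye(N), Rw.T @ yw)

pred = R @ W_out
err = np.sqrt(np.mean((pred[100:] - y[100:]) ** 2))
print(f"one-step training RMSE: {err:.2e}")
```

Keeping the input and recurrent weights fixed and training only the readout is what makes the ESN cheap to fit, which underlies its popularity across the applications listed above.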
“…The RC approach to machine learning has been successfully applied to a number of problems, for example, time series prediction 11 , visual identification tasks 12 , real-time detection of epileptic seizures 13 , and inference, from limited time series data, of unmeasured state variables 14 , Lyapunov exponents 15 , and causal dependencies between variables 16 .…”
Section: Introduction
confidence: 99%