2021
DOI: 10.3390/e23121672
Combining Measures of Signal Complexity and Machine Learning for Time Series Analysis: A Review

Abstract: Measures of signal complexity, such as the Hurst exponent, the fractal dimension, and the spectrum of Lyapunov exponents, are used in time series analysis to estimate the persistency, anti-persistency, fluctuations, and predictability of the data under study. They have proven beneficial for time series prediction with machine and deep learning, indicating which features may be relevant for predicting a time series and for establishing complexity features. Further, the performance of machine learning approach…
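The abstract's headline measure, the Hurst exponent, can be estimated in a few lines. Below is a minimal Python sketch of classical rescaled-range (R/S) analysis; it is not drawn from the paper itself, and the function name hurst_rs, the power-of-two window sizes, and the NumPy-only implementation are illustrative choices.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series via rescaled-range (R/S) analysis.

    H ~ 0.5: uncorrelated noise; H > 0.5: persistent; H < 0.5: anti-persistent.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Window sizes: powers of two between min_chunk and n // 2.
    sizes = [s for s in 2 ** np.arange(3, int(np.log2(n))) if min_chunk <= s <= n // 2]
    log_rs = []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):          # non-overlapping windows
            chunk = x[start:start + s]
            dev = np.cumsum(chunk - chunk.mean())     # cumulative deviation from the mean
            r = dev.max() - dev.min()                 # range of the cumulative deviations
            sd = chunk.std()
            if sd > 0:
                rs_vals.append(r / sd)
        log_rs.append(np.log(np.mean(rs_vals)))
    # The slope of log(R/S) against log(window size) estimates H.
    h, _ = np.polyfit(np.log(sizes), log_rs, 1)
    return h

rng = np.random.default_rng(0)
print(hurst_rs(rng.standard_normal(4096)))  # close to 0.5 for white noise
```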

Cited by 18 publications (10 citation statements)
References 93 publications
“…Fractal dimension and Hurst exponent analysis are based on a different theoretical framework than entropy-based methods, and they can be found as complexity measures in functional MRI studies (Bullmore et al., 2009; Tagliazucchi et al., 2013; Dong et al., 2018), EEG studies (Linkenkaer-Hansen et al., 2001; Raghavendra et al., 2009; Sabeti et al., 2009; Holloway et al., 2014), but also outside brain research (Raubitzek and Neubauer, 2021).…”
Section: Hurst Exponents
confidence: 99%
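For readers who want a concrete counterpart to the fractal-dimension side of this quote, the following is a minimal Python sketch of Higuchi's method, one common estimator of the fractal dimension of a time series. It is an illustration under our own assumptions (NumPy, the hypothetical name higuchi_fd, a default k_max of 10), not code from any of the cited studies.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Estimate the fractal dimension of a 1-D series with Higuchi's method.

    Values near 1 indicate a smooth curve, values near 2 a very rough one.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)                 # subsampled curve x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            # Normalized curve length for offset m and delay k.
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / (len(idx) - 1) / k
            lengths.append(dist * norm / k)
        log_k.append(np.log(1.0 / k))
        log_l.append(np.log(np.mean(lengths)))
    # The slope of log L(k) against log(1/k) estimates the fractal dimension.
    fd, _ = np.polyfit(log_k, log_l, 1)
    return fd

rng = np.random.default_rng(0)
print(higuchi_fd(rng.standard_normal(2000)))  # close to 2 for white noise
```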
“…It must be noted that non-linear extensions of G-causality can be computationally expensive [66]; however, according to Granger, non-linear models reflect a more proper approach to model practical problems, which are inherently non-linear [70]. Since Schreiber transfer entropy is a non-linear approach, it can be used to measure ‘causality’ using non-linear data [71]. In this sub-section, this paper investigates non-linear Schreiber transfer entropy and its relationship with G-causality for the KLFIN.…”
Section: Literature Review
confidence: 99%
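As a concrete illustration of the quoted idea, here is a minimal histogram-based sketch of Schreiber transfer entropy TE(X → Y) in Python. The estimator is our own simplified assumption (equal-frequency binning with bins = 4, the hypothetical name transfer_entropy), not the estimator used by the cited work; serious applications typically need finer binning or kernel/k-NN estimators.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Histogram estimate of Schreiber transfer entropy TE(X -> Y) in bits.

    TE = sum over states of p(y_{t+1}, y_t, x_t) *
         log2[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ].
    """
    def discretize(v):
        # Equal-frequency (quantile) binning into `bins` symbols.
        edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(v, edges)

    xs, ys = discretize(np.asarray(x)), discretize(np.asarray(y))
    triples = list(zip(ys[1:], ys[:-1], xs[:-1]))    # (y_{t+1}, y_t, x_t)
    n = len(triples)
    n_abc = Counter(triples)
    n_ab = Counter((a, b) for a, b, _ in triples)    # (y_{t+1}, y_t)
    n_b = Counter(b for _, b, _ in triples)          # y_t
    n_bc = Counter((b, c) for _, b, c in triples)    # (y_t, x_t)
    te = 0.0
    for (a, b, c), count in n_abc.items():
        # Conditional probabilities expressed through joint counts.
        num = count / n_bc[(b, c)]                   # p(y_{t+1} | y_t, x_t)
        den = n_ab[(a, b)] / n_b[b]                  # p(y_{t+1} | y_t)
        te += (count / n) * np.log2(num / den)
    return te

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)  # y lags x by one step
print(transfer_entropy(x, y), transfer_entropy(y, x))  # first value should be larger
```

Because y is driven by the previous value of x, the estimate in the direction X → Y should clearly exceed the (near-zero, small positive bias) estimate in the direction Y → X.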
“…Entropy is measuring any information uncertainty, diversity or randomness [71]. Low entropy implies that information uncertainty is low.…”
Section: Literature Review
confidence: 99%
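The quoted notion of entropy as a measure of uncertainty reduces, in its simplest discrete form, to Shannon entropy over a histogram. A minimal sketch, assuming NumPy, an arbitrary choice of 8 bins, and a function name of our own:

```python
import numpy as np

def shannon_entropy(values, bins=8):
    """Shannon entropy (bits) of a series after histogram discretization."""
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
print(shannon_entropy(rng.uniform(size=10_000)))  # near log2(8) = 3 bits: high uncertainty
print(shannon_entropy(np.ones(10_000)))           # 0 bits: a constant carries no uncertainty
```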
“…There are many techniques for estimating the entropy of time series, such as sample entropy, approximate entropy, Kolmogorov-Sinai (K-S) entropy, the K₂ entropy, and the multiscale entropy. K-S and the K₂ entropy require a large amount of data for proper complexity quantification (Raubitzek and Neubauer 2021). Approximate entropy and sample entropy, however, can deal with short-length signals (Richman and Moorman 2000).…”
Section: Introduction
confidence: 99%
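To make the contrast concrete, below is a minimal Python sketch of sample entropy as defined by Richman and Moorman (2000): the negative log of the ratio of template matches of length m + 1 to those of length m within tolerance r. The defaults m = 2 and r = 0.2 times the standard deviation are conventional choices, and the function name is ours.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A / B), where B counts template matches of length m
    and A matches of length m + 1, within Chebyshev tolerance r, excluding
    self-matches (Richman & Moorman 2000)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(mm):
        n_t = len(x) - m                    # same template count for m and m + 1
        templ = np.array([x[i:i + mm] for i in range(n_t)])
        count = 0
        for i in range(len(templ) - 1):
            # Chebyshev distance between template i and all later templates.
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))  # low: regular signal
print(sample_entropy(rng.standard_normal(500)))                 # higher: white noise
```

The short 500-sample inputs illustrate the quoted point: unlike K-S or K₂ entropy, sample entropy yields usable estimates on short signals.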