2021
DOI: 10.3390/e23121620

Multiscale Entropy Analysis of Short Signals: The Robustness of Fuzzy Entropy-Based Variants Compared to Full-Length Long Signals

Abstract: Multiscale entropy (MSE) analysis is a fundamental approach to assess the complexity of a time series by estimating its rate of information creation over a range of temporal scales. However, MSE may not be accurate or valid for short time series; for this reason, previous studies have applied various algorithmic derivations to short-term time series, but no study has systematically analyzed and compared their reliability. This study compares the MSE algorithm variations adapted to short time series on both human…
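The pipeline the abstract describes (coarse-grain the series at each temporal scale, then estimate entropy of each coarse-grained series) can be made concrete with a short sketch. This is a minimal Python illustration, not the paper's implementation; the sample-entropy estimator and the defaults m = 2 and r = 0.15 times the standard deviation are conventional assumptions.

```python
import numpy as np

def coarse_grain(x, scale):
    # Average consecutive non-overlapping windows of length `scale`.
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m, r):
    # SampEn = -ln(A/B): B and A count template pairs of length m and m+1
    # whose Chebyshev distance stays within tolerance r.
    def count_pairs(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        total = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            total += int(np.sum(d <= r))
        return total
    b, a = count_pairs(m), count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.nan

def multiscale_entropy(x, scales=range(1, 21), m=2, r_factor=0.15):
    # MSE curve: entropy of the coarse-grained series at each scale,
    # with the tolerance fixed from the original series, per convention.
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

Note how the coarse-grained series shrinks by a factor of the scale: this is exactly why MSE becomes unreliable on short records, since few templates remain at large scales.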

Cited by 10 publications (4 citation statements) | References 27 publications
“…In other words, the energy of 1/f noise contains complex dynamics that vary with frequency. Analysis based on multiscale entropy shows that 1/f noise has a more complex structure than WGN [9], [27], [32], [33]. Multiscale entropy values for the synthetic signals, WGN and 1/f noise, provide important insight with respect to physiological complexity.…”
Section: Evaluation Signals, A. Synthetic Signals
confidence: 99%
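The WGN versus 1/f comparison in this quote is easy to reproduce. Below is a sketch that generates both synthetic signals and evaluates them with the multiscale_entropy() function from the sketch above; the spectral-shaping construction of pink noise and the seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# White Gaussian noise: flat power spectrum.
wgn = rng.standard_normal(n)

# 1/f (pink) noise by spectral shaping: scale rFFT amplitudes by 1/sqrt(f)
# so that power falls off as 1/f.
spec = np.fft.rfft(rng.standard_normal(n))
freqs = np.fft.rfftfreq(n)
freqs[0] = freqs[1]                       # avoid dividing by zero at DC
pink = np.fft.irfft(spec / np.sqrt(freqs), n)
pink = (pink - pink.mean()) / pink.std()  # normalize to unit variance

# Expected pattern: the WGN curve decays with scale (averaging destroys its
# structure), while the 1/f curve stays roughly flat, so 1/f noise scores
# as the more complex signal across scales.
mse_wgn = multiscale_entropy(wgn)
mse_pink = multiscale_entropy(pink)
```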
“…The cross-entropy loss function, also known as log loss, is the most commonly used loss function for backpropagation. The cross-entropy loss increases as the predicted probability deviates from the actual label, and can be described as $L_{ce}(\hat{y}_i, y_i) = -\sum_i y_i \log \hat{y}_i$ (25). In this paper, the label $l_n$ of each image is used, which is assumed to be 1 only for images belonging to the same class during testing, and 0 otherwise. The cross-entropy formula can be expressed as…”
Section: Proposed Network with RMEE Criterion
confidence: 99%
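Equation (25) above translates directly into a few lines of code. A minimal sketch, with the clipping constant and the example label vectors as illustrative assumptions rather than details from the cited paper:

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    # L_ce(yhat, y) = -sum_i y_i * log(yhat_i), as in Eq. (25);
    # clip predictions away from zero to avoid log(0).
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred))

# Binary labelling from the quote: 1 for images of the same class, 0 otherwise.
y_true = np.array([1.0, 0.0, 0.0])    # one-hot ground truth
y_pred = np.array([0.7, 0.2, 0.1])    # predicted class probabilities
print(cross_entropy(y_true, y_pred))  # -log(0.7) ~= 0.357
```

The loss grows without bound as the probability assigned to the true class approaches zero, which is the "increases as the predicted probability deviates from the actual label" behavior the quote describes.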
“…Previously, Chen et al. proposed research focusing on maximum correntropy theory and minimum error entropy criteria to improve the robustness of machine learning [20][21][22]. In addition, a series of entropy-based learning algorithms have been presented to improve the robustness of machine learning models, including guided complement entropy and fuzzy entropy [23][24][25]. Nevertheless, the ITL-based approach has not yet been applied to spike-based continual meta-learning to improve its learning robustness.…”
Section: Introduction
confidence: 99%
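Fuzzy entropy, named in this quote and central to the article above, replaces sample entropy's hard match threshold with a graded membership function, which keeps the estimate defined on short records where hard matching may find no pairs at all. A minimal sketch under conventional assumptions (exponential membership, m = 2, r = 0.15 times the standard deviation, p = 2); illustrative only, not the implementation of any cited paper:

```python
import numpy as np

def fuzzy_entropy(x, m=2, r_factor=0.15, p=2):
    # FuzzyEn = ln(phi_m) - ln(phi_{m+1}), where phi averages the smooth
    # membership exp(-(d/r)^p) over template pairs instead of the hard
    # threshold d <= r used by sample entropy.
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    def phi(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        t = t - t.mean(axis=1, keepdims=True)  # remove each template's baseline
        sims = []
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            sims.append(np.exp(-(d / r) ** p))
        return np.mean(np.concatenate(sims))
    return np.log(phi(m)) - np.log(phi(m + 1))
```

Because every template pair contributes a nonzero membership, the estimator degrades gracefully as the series shortens, which is the robustness property the article's fuzzy entropy-based MSE variants exploit.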
“…Finally, the set of HRV indices explored here, although comprehensive, is not exhaustive. The prognostic value of several other entropy [58–60], fractal [61] and general [62–64] methods in CCC should be investigated in future studies.…”
Section: Limitations
confidence: 99%