2019
DOI: 10.3390/e21090840

Interpretation of Entropy Algorithms in the Context of Biomedical Signal Analysis and Their Application to EEG Analysis in Epilepsy

Abstract: Biomedical signals are measurable time series that describe a physiological state of a biological system. Entropy algorithms have been previously used to quantify the complexity of biomedical signals, but there is a need to understand the relationship of entropy to signal processing concepts. In this study, ten synthetic signals that represent widely encountered signal structures in the field of signal processing were created to interpret permutation, modified permutation, sample, quadratic sample, and fuzzy entropy …

Cited by 29 publications (12 citation statements: 0 supporting, 12 mentioning, 0 contrasting)
References: 42 publications

“…Shannon entropy [26], Rényi entropy [29], and average entropy [28] can be applied globally to all data, or locally only to points around specific points [39]. However, they ignore the temporal order of the patterns in the signal [40].…”
Section: Discussion (mentioning)
confidence: 99%
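
To make the quoted limitation concrete, here is a minimal Python sketch (illustrative only; the function name, bin count, and test signal are assumptions, not taken from any cited paper) showing that a histogram-based Shannon entropy is invariant to shuffling the samples, i.e., it ignores temporal order:

```python
import numpy as np

def shannon_entropy(signal, n_bins=16):
    """Shannon entropy (base 2) of a signal's amplitude histogram."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts[counts > 0] / counts.sum()  # empirical bin probabilities; 0 log 0 := 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 8 * np.pi, 1000)) + 0.1 * rng.standard_normal(1000)
shuffled = rng.permutation(x)  # destroys all temporal structure

# Identical output: the histogram is blind to the order of the samples.
print(shannon_entropy(x), shannon_entropy(shuffled))
```

Because both calls see the same amplitude histogram, they print identical values even though the shuffled series has no temporal structure left.
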
“…Permutation entropy [31] and edge permutation entropy [32] use the temporal information [39], but they rely on the occurrence of equal values in the sub-series [41]. Approximate entropy [33] has the advantage of lower computational demand and less effect from noise, but it strongly depends on the time-series length and therefore lacks consistency [40]. Sample entropy [34] is invariant to the time-series length and it performs more consistently under various conditions.…”
Section: Discussion (mentioning)
confidence: 99%
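
The ordinal-pattern idea referred to above can be sketched as follows. This is a minimal, illustrative implementation of permutation entropy in the Bandt-Pompe style, not code from any of the cited papers; the embedding dimension, delay, and tie-breaking convention are assumptions. The argsort tie-breaking also shows where equal values in the sub-series become an issue, as the quoted statement notes:

```python
import numpy as np
from math import factorial

def permutation_entropy(signal, m=3, tau=1):
    """Normalized permutation entropy (Bandt-Pompe style) of a 1-D signal.

    m   : embedding dimension (length of the ordinal patterns)
    tau : time delay between the samples forming each pattern
    """
    x = np.asarray(signal)
    n = len(x) - (m - 1) * tau  # number of ordinal patterns in the series
    counts = {}
    for i in range(n):
        # Ordinal pattern of m samples; np.argsort breaks ties by position,
        # which is one common convention when equal values occur.
        pattern = tuple(np.argsort(x[i : i + m * tau : tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values())) / n
    return -np.sum(p * np.log2(p)) / np.log2(factorial(m))

# A monotone ramp has a single ordinal pattern (entropy 0);
# white noise uses all m! patterns roughly equally (entropy near 1).
print(permutation_entropy(np.arange(100)))
print(permutation_entropy(np.random.default_rng(1).standard_normal(1000)))
```
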
“…We used mean and variance to capture the main characteristics of the signals in the time domain. Furthermore, we calculated entropy, a measure from the information domain, to infer the state of the respective ANS subsystem; entropy has previously been suggested to differentiate between ictal and non-ictal segments of intracranial EEG data [15].…”
Section: Discussion (mentioning)
confidence: 99%
“…We chose signal entropy as a potential signal marker of interest, since entropy has been shown to increase prior to certain state changes in neural systems before [14]. Related to epilepsy, entropy was previously used, for example, in EEG [15] and ECG [16] data. Entropy was calculated for each segment as H = −Σ pᵢ log pᵢ, with log denoting the logarithm to base 2 and pᵢ being the probability of bin i obtained by binning the data into n bins.…”
Section: Data Recording (mentioning)
confidence: 99%
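
A minimal sketch of the segment-wise binned entropy described in the quoted statement follows; the segment length, bin count, and test signal are illustrative assumptions, not values from the paper:

```python
import numpy as np

def binned_entropy(segment, n_bins=20):
    """H = -sum(p * log2 p) over a histogram of the segment (n bins)."""
    counts, _ = np.histogram(segment, bins=n_bins)
    p = counts[counts > 0] / counts.sum()  # drop empty bins: 0 log 0 := 0
    return -np.sum(p * np.log2(p))

# Per-segment entropy of a signal, as in the quoted statement.
signal = np.random.default_rng(2).standard_normal(5000)
seg_len = 500  # illustrative segment length, not from the paper
entropies = [binned_entropy(signal[i : i + seg_len])
             for i in range(0, len(signal) - seg_len + 1, seg_len)]
print(entropies)
```
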
“…Recent studies of biological systems have offered novel approaches to extracting features based on single- and multiscale entropy measures in order to achieve high classification accuracy. In particular, entropy-based features extracted from EEG signals help researchers in the early diagnosis of epilepsy, different types of sleep disorders, and brain-related disorders such as Alzheimer's [45]. Acharya et al. [46] extracted features from EEG signals by using ApEn, SampEn, and Phase Entropies (S1 and S2) for the purpose of detecting epilepsy.…”
Section: Discussion (mentioning)
confidence: 99%