2017
DOI: 10.3390/e19030134

Permutation Entropy: New Ideas and Challenges

Abstract: Over recent years, some new variants of Permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants using some additional metric information or being based on entropies that are different from the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data.

Cited by 63 publications (26 citation statements)
References 30 publications
“…This normalization motivates a recently introduced, alternative interpretation of permutation entropy as the Kullback-Leibler divergence (KL divergence) of the deviation of the empirical distribution from that of white noise (see [16,25] for some exposition about this perspective). The KL divergence was originally defined in [26] to measure the information theoretic distance between a distribution and another expected distribution, and has become a standard measure in many fields.…”
Section: Permutation Entropy and KL Divergence
confidence: 89%
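The KL-divergence interpretation quoted above rests on an exact identity: for embedding order m, the empirical permutation entropy H(p) and the KL divergence of the pattern distribution p from the uniform (white-noise) distribution over the m! patterns always sum to log(m!). A minimal sketch in Python, with function names of my own choosing (not from the cited papers), illustrates this:

```python
import math
import random
from collections import Counter

def ordinal_pattern(window):
    """Rank pattern of a window: the permutation of indices that sorts it."""
    return tuple(sorted(range(len(window)), key=window.__getitem__))

def pattern_distribution(x, m):
    """Empirical distribution of ordinal patterns of length m along x."""
    counts = Counter(ordinal_pattern(x[i:i + m]) for i in range(len(x) - m + 1))
    total = sum(counts.values())
    return {pat: c / total for pat, c in counts.items()}

def permutation_entropy(x, m):
    """Shannon entropy (in nats) of the ordinal-pattern distribution."""
    dist = pattern_distribution(x, m)
    return -sum(p * math.log(p) for p in dist.values())

def kl_from_white_noise(x, m):
    """D_KL(p || uniform over the m! patterns); equals log(m!) - H(p)."""
    dist = pattern_distribution(x, m)
    return sum(p * math.log(p * math.factorial(m)) for p in dist.values())

# Gaussian white noise: ties between samples are almost surely absent.
random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(1000)]
h = permutation_entropy(x, 3)
d = kl_from_white_noise(x, 3)
# The two quantities sum to log(3!) up to floating-point error.
```

For white noise h is close to log(3!) and d is close to 0; for strongly structured data the balance shifts toward d, which is what makes the divergence reading of permutation entropy natural.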
“…Datasets of a similar scale are increasingly available in the current big data paradigm, and permutation methods are well positioned to contribute to comprehensive and meaningful analyses. Three recent surveys [16][17][18] provide a comprehensive overview of recent developments and applications.…”
Section: Introduction
confidence: 99%
“…It has been applied in the fields of biomedical sciences [8,9], mechanical diagnosis [10] and underwater acoustic signal processing [11]. PE is one of the most effective ways to detect randomness and dynamic changes in time series, based on comparisons of neighboring values [12][13][14]. More recently, NPE [15], a new variant of PE, was proposed by Bandt in 2017 to classify different sleep stages.…”
Section: Introduction
confidence: 99%
“…Here we suggest a statistic based on the conditional entropy of ordinal patterns introduced in [18]. The latter is a complexity measure similar to the celebrated permutation entropy [9], but with notably better performance (see [17,18]). Let us provide an "obvious" example only to motivate our approach and to illustrate its idea.…”
Section: Introduction
confidence: 99%
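The conditional entropy of ordinal patterns mentioned in this excerpt can be estimated from successive patterns via the chain rule, H(next | current) = H(current, next) − H(current). The sketch below is my own hedged reading of that idea, not the implementation from [18]:

```python
import math
import random
from collections import Counter

def ordinal_patterns(x, m):
    """Sequence of overlapping ordinal patterns of length m along x."""
    return [tuple(sorted(range(m), key=lambda i: x[t + i]))
            for t in range(len(x) - m + 1)]

def shannon_entropy(counts):
    """Shannon entropy (in nats) of a Counter of observations."""
    n = sum(counts.values())
    return -sum(c / n * math.log(c / n) for c in counts.values())

def conditional_entropy_of_patterns(x, m):
    """Estimate H(pi_{t+1} | pi_t) as H(pi_t, pi_{t+1}) - H(pi_t)."""
    pats = ordinal_patterns(x, m)
    current = Counter(pats[:-1])              # marginal over current patterns
    pairs = Counter(zip(pats[:-1], pats[1:])) # joint over successive patterns
    return shannon_entropy(pairs) - shannon_entropy(current)

random.seed(2)
x = [random.gauss(0.0, 1.0) for _ in range(2000)]
ce = conditional_entropy_of_patterns(x, 3)
```

Because successive overlapping patterns share m − 1 points, the conditional entropy is typically much smaller than the plain permutation entropy, which is the effect the quoted statement exploits.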
“…A result of this transformation is demonstrated in Figure 1 for order d = 1. Note that the distribution of ordinal patterns contains much information about the original time series, which makes ordinal patterns interesting for data analysis, especially for data from nonlinear systems (see [16,17]). …”
Section: Introduction
confidence: 99%
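For order d = 1 the transformation described above simply records whether the series goes up or down between consecutive samples, so the pattern distribution is a two-bin histogram. A tiny illustrative sketch (the series and names are my own, not from the cited work):

```python
from collections import Counter

def up_down_patterns(x):
    """Ordinal patterns of order d = 1: compare each sample with its successor."""
    return ["up" if b > a else "down" for a, b in zip(x, x[1:])]

series = [1.0, 3.0, 2.0, 4.0, 4.5, 3.5]
patterns = up_down_patterns(series)
histogram = Counter(patterns)
# patterns  -> ['up', 'down', 'up', 'up', 'down']
# histogram -> Counter({'up': 3, 'down': 2})
```

Higher orders work the same way, except each window of d + 1 points is mapped to one of (d + 1)! possible rank patterns instead of just two.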