2011
DOI: 10.3390/e13061055
Distances in Probability Space and the Statistical Complexity Setup

Abstract: Statistical complexity measures (SCM) are the composition of two ingredients: (i) entropies and (ii) distances in probability space. In consequence, SCMs provide a simultaneous quantification of the randomness and the correlational structures present in the system under study. We address in this review important topics underlying the SCM structure, viz., (a) a good choice of probability metric space and (b) how to assess the best distance choice, which in this context is called a "disequilibrium" and is denoted …

Cited by 56 publications (55 citation statements) | References 47 publications
“…of (i) the normalized Shannon entropy and (ii) the so-called disequilibrium Q_J, which is defined in terms of the extensive (in the thermodynamical sense) Jensen-Shannon divergence J[P, P_e] that links two PDFs [36]. The Jensen-Shannon divergence, which quantifies the difference between two (or more) probability distributions, is especially useful to compare the symbol composition of different sequences [37].…”
Section: The Statistical Complexity and the Complexity-Entropy Plane
confidence: 99%
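The product form quoted above lends itself to a short numerical sketch. The snippet below is a minimal illustration, not the authors' code: the function names (shannon_entropy, jensen_shannon_divergence, statistical_complexity) are hypothetical, and the normalization constant is the usual maximum of J[P, P_e], attained when P is concentrated on a single state, so that the disequilibrium stays in [0, 1].

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy S[P] in nats, skipping zero-probability bins."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def jensen_shannon_divergence(p, q):
    """J[P, Q] = S[(P + Q)/2] - S[P]/2 - S[Q]/2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return (shannon_entropy(0.5 * (p + q))
            - 0.5 * shannon_entropy(p) - 0.5 * shannon_entropy(q))

def statistical_complexity(p):
    """C = H[P] * Q_J[P, P_e]: normalized Shannon entropy times the
    Jensen-Shannon disequilibrium with respect to the uniform PDF P_e."""
    p = np.asarray(p, dtype=float)
    n = p.size
    p_e = np.full(n, 1.0 / n)                   # uniform ("equilibrium") PDF
    h = shannon_entropy(p) / np.log(n)          # normalized entropy, in [0, 1]
    # j_max = max over P of J[P, P_e]; dividing keeps Q_J in [0, 1]
    j_max = -0.5 * (((n + 1) / n) * np.log(n + 1) - 2 * np.log(2 * n) + np.log(n))
    q_j = jensen_shannon_divergence(p, p_e) / j_max
    return h * q_j
```

Both limiting cases behave as expected: the uniform PDF gives zero complexity because the disequilibrium vanishes, while a delta-like PDF gives zero because the entropy factor vanishes.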
“…In this paper the statistical complexity is assessed through the information carried by the signal [16,12]. This approach provides a link between the entropy of the random source that generates the signal and the distance of the probability distribution of the generating source, p, to the uniform distribution p_e.…”
Section: Statistical Complexity
confidence: 99%
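As a purely illustrative usage example (the Gaussian signal, the 16-bin histogram and the variable names are assumptions, not taken from the cited paper), the two ingredients of this link, the entropy of the generating source and the distance of p to the uniform p_e, can be estimated directly with NumPy/SciPy:

```python
import numpy as np
from scipy.stats import entropy
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(0)
signal = rng.normal(size=10_000)            # toy stand-in for a measured signal
counts, _ = np.histogram(signal, bins=16)   # crude estimate of the source PDF
p = counts / counts.sum()
p_e = np.full_like(p, 1.0 / p.size)         # uniform distribution p_e

h_norm = entropy(p) / np.log(p.size)        # normalized Shannon entropy of the source
js_div = jensenshannon(p, p_e) ** 2         # Jensen-Shannon divergence J[p, p_e]
print(f"H = {h_norm:.3f}  J[p, p_e] = {js_div:.4f}")
```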
“…In the following we will only employ the Hellinger divergence, which is also a distance, for which h(y) = y/2, 0 ≤ y < 2, and φ(x) = (√x − 1)². The influence of the choice of a distance when computing statistical complexities is studied in Reference [11]. Following Rosso et al. [17], we work with the Hellinger distance and we define the Statistical Complexity of coordinate (i, j) in an intensity SAR image as the product…”
Section: Generalized Measure of Statistical Complexity
confidence: 99%
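A minimal sketch of this Hellinger-based variant is given below, assuming the (h, φ) pair quoted above and the same entropy-times-disequilibrium product form; the per-pixel construction on SAR intensity data from the cited work is not reproduced, and the function names are illustrative.

```python
import numpy as np

def hellinger_distance(p, q):
    """(h, phi)-divergence with phi(x) = (sqrt(x) - 1)^2 and h(y) = y / 2,
    i.e. 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2, which lies in [0, 1]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def hellinger_complexity(p):
    """Normalized Shannon entropy of p times the Hellinger distance between
    p and the uniform PDF p_e (entropy-times-disequilibrium product form)."""
    p = np.asarray(p, dtype=float)
    n = p.size
    p_e = np.full(n, 1.0 / n)
    nz = p > 0
    h = -np.sum(p[nz] * np.log(p[nz])) / np.log(n)   # normalized Shannon entropy
    return h * hellinger_distance(p, p_e)
```

Since both the normalized Jensen-Shannon disequilibrium and this Hellinger distance are bounded by 1, the resulting complexities stay on a comparable scale across the two distance choices.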