2017
DOI: 10.1140/epjst/e2016-60347-2

Three perspectives on complexity: entropy, compression, subsymmetry

Abstract: There is no single universally accepted definition of 'Complexity'. There are several perspectives on complexity and what constitutes complex behaviour or complex systems, as opposed to regular, predictable behaviour and simple systems. In this paper, we explore the following perspectives on complexity: effort-to-describe (Shannon entropy H, Lempel-Ziv complexity LZ), effort-to-compress (ETC complexity) and degree-of-order (Subsymmetry or SubSym). While Shannon entropy and LZ are very popular and widely used,…

Cited by 25 publications (20 citation statements)
References: 47 publications
“…Now, for this notion of "complexity", which has been referred to several times in this section, there is no single unique definition. As noted in Nagaraj & Balasubramanian (2017b), Shannon entropy (Shannon, 1948; Cover & Thomas, 2012) is a very popular and intuitive measure of complexity. A low value of Shannon entropy indicates high redundancy and structure (low complexity) in the data, and a high value indicates low redundancy and high randomness (high complexity).…”
Section: Dynamical Complexity (DC) and Dynamical Compression-Complexity
confidence: 99%
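The interpretation quoted above (low entropy for redundant, structured data; high entropy for random data) can be sketched in a few lines. This is an illustrative computation, not code from the cited papers; the function name is ours:

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Shannon entropy H in bits per symbol of a symbolic sequence."""
    n = len(seq)
    counts = Counter(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A highly redundant sequence has low H (low complexity); a balanced
# binary sequence attains the maximum H = log2(2) = 1 bit.
print(shannon_entropy("00000001"))  # ≈ 0.544 (high redundancy)
print(shannon_entropy("01001101"))  # 1.0 (balanced 0s and 1s)
```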
“…ETC is defined as the effort to compress the input sequence using the lossless compression algorithm known as Non-sequential Recursive Pair Substitution (NSRPS). It has been demonstrated that both LZ and ETC outperform Shannon entropy in accurately characterizing the dynamical complexity of both stochastic (Markov) and deterministic chaotic systems in the presence of noise (Nagaraj & Balasubramanian, 2017a; Nagaraj & Balasubramanian, 2017b). Further, ETC has been shown to reliably capture the complexity of very short time series, where even LZ fails (Nagaraj & Balasubramanian, 2017a), and to analyze short RR tachograms from healthy young and old subjects (Balasubramanian & Nagaraj, 2016).…”
Section: Dynamical Complexity (DC) and Dynamical Compression-Complexity
confidence: 99%
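NSRPS, as described in the statement above, repeatedly substitutes the most frequent pair of adjacent symbols with a new symbol until the sequence becomes constant; ETC counts the iterations needed. A minimal sketch under that description follows — it is not the authors' reference implementation, and tie-breaking between equally frequent pairs is arbitrary here:

```python
from collections import Counter

def etc(seq):
    """Effort-To-Compress: number of NSRPS iterations needed to reduce
    seq (an iterable of hashable symbols) to a constant sequence."""
    seq = list(seq)
    steps = 0
    fresh_id = 0  # tagged tuples serve as symbols outside the alphabet
    while len(seq) > 1 and len(set(seq)) > 1:
        # Most frequent adjacent pair (counted with overlaps, for simplicity).
        pairs = Counter(zip(seq, seq[1:]))
        pair = max(pairs, key=pairs.get)
        fresh = ("new", fresh_id)
        fresh_id += 1
        # Replace non-overlapping occurrences left to right.
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(fresh)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps

print(etc("11111"))   # 0: already constant, nothing to compress
print(etc("010101"))  # 1: "01" -> A gives "AAA", which is constant
```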
“…It should be noted that both LZ and ETC are complexity measures derived from lossless data compression algorithms (hence we term them compression-complexity measures). It has been demonstrated that both LZ and ETC outperform Shannon entropy in characterizing the complexity of noisy time series of short length arising out of stochastic (Markov) and chaotic systems [37,39,40]. Further, ETC consistently performs better than LZ in a number of applications, as shown in recently published literature [39][40][41][42].…”
Section: Strength of the Hydrogen Bonds
confidence: 81%
“…It has been demonstrated that both LZ and ETC outperform Shannon entropy in characterizing the complexity of noisy time series of short length arising out of stochastic (Markov) and chaotic systems [37,39,40]. Further, ETC consistently performs better than LZ in a number of applications, as shown in recently published literature [39][40][41][42]. For details of how to compute LZ and ETC on actual input sequences, we refer the readers to [22,37,43].…”
Section: Strength of the Hydrogen Bonds
confidence: 98%
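For comparison with ETC, Lempel-Ziv (LZ76) complexity — the other compression-complexity measure named in these statements — counts the phrases in an incremental parsing of the sequence. A compact sketch using the common substring-search formulation (illustrative only, without the length normalization the cited works may apply):

```python
def lz_complexity(s):
    """Number of phrases in the LZ76 incremental parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # Extend the current phrase while it can still be reproduced
        # from the already-seen part of the sequence.
        while i + l < n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

# The classic example parses as 0|001|10|100|1000|101 -> 6 phrases.
print(lz_complexity("0001101001000101"))  # 6
```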
“…Now, for this notion of "complexity", which has been referred to several times in this section, there is no single unique definition. As noted in Nagaraj and Balasubramanian (2017b), Shannon entropy (Cover and Thomas, 2012) is a very popular and intuitive measure of complexity. A low value of Shannon entropy indicates high redundancy and structure (low complexity) in the data, and a high value indicates low redundancy and high randomness (high complexity).…”
confidence: 99%