2019
DOI: 10.3390/info10110358

Information Gain in Event Space Reflects Chance and Necessity Components of an Event

Abstract: Information flow for occurrences in phase space can be assessed through the application of the Lyapunov characteristic exponent (multiplicative ergodic theorem), which is positive for non-linear systems that act as information sources and is negative for events that constitute information sinks. Attempts to unify the reversible descriptions of dynamics with the irreversible descriptions of thermodynamics have replaced phase space models with event space models. The introduction of operators for time and entrop…
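The abstract's distinction between information sources (positive Lyapunov exponent) and information sinks (negative exponent) can be illustrated with a minimal numerical sketch. The example below uses the logistic map as a stand-in system, not the paper's event-space formulation; the function name and parameters are illustrative only.

```python
import math

def lyapunov_logistic(r, x0=0.3, n_transient=1000, n_steps=100_000):
    """Estimate the Lyapunov characteristic exponent of the logistic map
    x_{n+1} = r * x_n * (1 - x_n) as the long-run average of
    log|f'(x_n)|, where f'(x) = r * (1 - 2x)."""
    x = x0
    for _ in range(n_transient):          # discard transient iterations
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_steps):
        deriv = abs(r * (1 - 2 * x))
        acc += math.log(max(deriv, 1e-300))   # guard against log(0)
        x = r * x * (1 - x)
    return acc / n_steps

# In the chaotic regime (r = 4) the exponent is positive (analytically ln 2),
# i.e. the system produces information; in the stable regime (r = 2.5) it is
# negative, i.e. the system acts as an information sink.
print(lyapunov_logistic(4.0))   # positive, near ln 2
print(lyapunov_logistic(2.5))   # negative
```

The sign of the estimate, rather than its exact value, is what separates source from sink in the sense used above.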

Cited by 2 publications (3 citation statements)
References 20 publications
“…By contrast, we are enabled to measure new information produced by an event (through the Lyapunov characteristic exponent) and the degree of complexity inherent in the event (through the information dimension) in order to gain relevant, quantitative readouts (Section 3). Rather than a dichotomy between chance and necessity, quantifiable information evolution and information dimension, as well as entropy, are suitable scientific measures for the complexity of an occurrence, which can be interpreted as being reflective of the "ordered" and "chance" components that contribute to this occurrence [5].…”
Section: Discussion
confidence: 99%
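The quoted passage pairs the Lyapunov exponent with the information dimension as complementary measures of an event's complexity. As a hedged sketch of the latter, the box-counting estimator below computes the information dimension of a point cloud; the function name, parameters, and test data are hypothetical illustrations, not taken from the cited paper.

```python
import numpy as np

def information_dimension(points, epsilons):
    """Estimate the information dimension D1 of a point cloud as
    H(eps) / log(1/eps), where H(eps) is the Shannon entropy of the
    box-occupation probabilities at box side length eps, averaged
    over the given list of box sizes."""
    estimates = []
    for eps in epsilons:
        # assign each point to a grid box of side eps
        boxes = np.floor(points / eps).astype(np.int64)
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / counts.sum()
        h = -np.sum(p * np.log(p))        # Shannon entropy in nats
        estimates.append(h / np.log(1.0 / eps))
    return float(np.mean(estimates))

# Illustrative check: points on a line segment embedded in the plane
# should yield a dimension near 1; a uniform 2-D cloud, near 2.
rng = np.random.default_rng(0)
t = rng.random(20_000)
line = np.column_stack([t, t])            # 1-D manifold in 2-D
cloud = rng.random((20_000, 2))           # genuinely 2-D
print(information_dimension(line, [0.02, 0.01, 0.005]))
print(information_dimension(cloud, [0.05, 0.03]))
```

In the framing of the passage, a higher information dimension would correspond to a greater "chance" contribution to the occurrence, a lower one to a more "ordered" structure.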
“…However, due to the reliance on the eigenvectors and eigenvalues of the operators for time and thermodynamic entropy, an assessment of the information evolution associated with an event now requires algorithms that are independent of trajectories in phase space. To achieve this, a reformulation of the Lyapunov characteristic exponent has been applied to the vector-based event space [5].…”
Section: Conflicts Of Interest
confidence: 99%