2019
DOI: 10.3390/e21080775

Comparing Information Metrics for a Coupled Ornstein–Uhlenbeck Process

Abstract: It is often the case when studying complex dynamical systems that a statistical formulation can provide the greatest insight into the underlying dynamics. When discussing the behavior of such a system which is evolving in time, it is useful to have the notion of a metric between two given states. A popular measure of information change in a system under perturbation has been the relative entropy of the states, as this notion allows us to quantify the difference between states of a system at different times. In…

Cited by 24 publications (75 citation statements)
References 27 publications
“…Using Equations (25) and (26) in Equations (6) and (7), we have the time-dependent (joint) PDF (5) at any time t for our system (23) and (24). To calculate Equation (11) with the help of Equations (25) and (26), … Figure 1c,d show the time-evolution of the information velocity E(t) and information length L(t), respectively, for different values of D ∈ (0.0005, 0.04).…”
Section: Kramers Equation
confidence: 99%
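The quantities E(t) and L(t) in the excerpt above can be illustrated with a minimal numerical sketch. The snippet below is not the cited paper's coupled system: it assumes a single 1D Ornstein–Uhlenbeck process with hypothetical parameters (gamma, D, mu0, v0), whose PDF stays Gaussian, and uses the standard Gaussian expressions for the information velocity and information length.

```python
import numpy as np

# Sketch (not the paper's exact coupled system): information velocity E(t)
# and information length L(t) for a single 1D Ornstein-Uhlenbeck process
#     dx = -gamma * x dt + sqrt(2 * D) dW,
# whose PDF stays Gaussian with mean mu(t) and variance v(t).  For a
# Gaussian PDF, the standard results are
#     E(t) = mu'(t)**2 / v(t) + v'(t)**2 / (2 * v(t)**2),
#     L(t) = integral_0^t sqrt(E(t')) dt'.
def ou_information_length(mu0, v0, gamma, D, t):
    v_inf = D / gamma                                   # stationary variance
    mu = mu0 * np.exp(-gamma * t)                       # mean relaxes to 0
    v = v_inf + (v0 - v_inf) * np.exp(-2.0 * gamma * t) # variance relaxes to v_inf
    mu_dot = -gamma * mu
    v_dot = -2.0 * gamma * (v - v_inf)
    E = mu_dot**2 / v + v_dot**2 / (2.0 * v**2)         # information velocity
    root_E = np.sqrt(E)
    # cumulative trapezoidal integral of sqrt(E) gives L(t)
    L = np.concatenate(
        ([0.0], np.cumsum(0.5 * (root_E[1:] + root_E[:-1]) * np.diff(t)))
    )
    return E, L

t = np.linspace(0.0, 10.0, 2001)
E, L = ou_information_length(mu0=1.0, v0=0.01, gamma=1.0, D=0.01, t=t)
```

With v0 chosen equal to the stationary variance D/gamma, only the mean relaxes, E(t) decays exponentially, and L(t) saturates as the system settles into its equilibrium state.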
See 2 more Smart Citations
“…. 26Using Equations (25) and (26) in Equations (6) and 7, we have the time-dependent (joint) PDF (5) at any time t for our system (23) and (24). To calculate Equation 11with the help of Equations (25) and 26 Figure 1c,d show the time-evolution of the information velocity E (t) and information length L(t), respectively, for different values of D ∈ (0.0005, 0.04).…”
Section: Kramers Equationmentioning
confidence: 99%
“…For instance, when these two PDFs represent the two states at different times, the relative entropy between them tells us nothing about how one PDF evolves to the other PDF over time or what intermediate states a system passes through between the two PDFs. As a result, it can only inform us of the changes that affect the overall system evolution [26]. To overcome this limitation, the information length was proposed in recent works, which quantifies the total number of different states that the system evolves through in time [27,28].…”
Section: Introduction
confidence: 99%
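The contrast drawn in this excerpt can be made explicit with the standard definitions of the two quantities (generic notation, not tied to any one paper's conventions):

```latex
% Relative entropy (Kullback--Leibler divergence) between two PDFs,
% a global comparison that ignores the path between them:
D_{KL}\left[p_1 \,\|\, p_2\right]
  = \int p_1(x)\,\ln\frac{p_1(x)}{p_2(x)}\,dx .

% Information length: the cumulative number of statistically
% distinguishable states traversed between time 0 and time t:
\mathcal{L}(t)
  = \int_0^{t} \sqrt{\int \frac{1}{p(x,t')}
      \left(\frac{\partial p(x,t')}{\partial t'}\right)^{2} dx}\; dt' .
```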
“…The linear relation indicates that a linear process preserves the linearity of the underlying process. Heseltine & Kim [18] show that this linear relation is lost for other metrics (e.g., Kullback–Leibler divergence, Jensen divergence). Note that the information length is related to the integral of the square root of the infinitesimal relative entropy (see Appendix A).…”
Section: Introduction
confidence: 99%
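The relation to the infinitesimal relative entropy mentioned in the excerpt above follows from the standard second-order expansion of the Kullback–Leibler divergence between two infinitesimally separated PDFs (the first-order term vanishes by normalization):

```latex
% Expanding D_KL between p(x,t) and p(x,t+dt) to second order in dt
% (the O(dt) term vanishes since \int \partial_t p \, dx = 0):
D_{KL}\left[p(\cdot,t)\,\|\,p(\cdot,t+dt)\right]
  = \frac{(dt)^{2}}{2}\int \frac{1}{p}
      \left(\frac{\partial p}{\partial t}\right)^{2} dx
    + O\!\left((dt)^{3}\right),

% so the increment of the information length is
d\mathcal{L}
  = \sqrt{\,2\,D_{KL}\left[p(\cdot,t)\,\|\,p(\cdot,t+dt)\right]\,}
    + O\!\left((dt)^{2}\right).
```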
“…It is this attractor structure that interests us in this paper. We thus focus on the relaxation problem, as in [9,12,15,18], by considering periodic deterministic forces, and elucidate the importance of the initial condition and its interplay with the deterministic forces in the relaxation, and thus in the attractor structure.…”
Section: Introduction
confidence: 99%