2016
DOI: 10.1209/0295-5075/115/58003
Scaling of information in turbulence

Abstract: PACS 89.70.Cf - Entropy in information theory; PACS 47.27.Jv - High-Reynolds-number turbulence; PACS 89.75.Da - Scaling phenomena in complex systems. We propose a new perspective on turbulence using information theory. We compute the entropy rate of a turbulent velocity signal and we particularly focus on its dependence on the scale. We first report how the entropy rate is able to describe the distribution of information amongst scales, and how one can use it to isolate the injection, inertial and dissipative ranges…
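
To give a rough feel for the method the abstract describes, the sketch below synthesizes a Gaussian signal with a Kolmogorov-like f^(-5/3) spectrum and tracks how the entropy of its increments varies with the scale. It is an illustration under stated assumptions, not the paper's procedure: the paper uses a k-nearest-neighbor entropy-rate estimator, while here the exact Gaussian entropy 0.5·ln(2πe·Var) of the increments serves as a cheap stand-in, and the signal length and cutoff frequencies are arbitrary choices.

```python
# Illustrative sketch only (assumptions throughout): Gaussian entropy of the
# increments of a random-phase signal with a Kolmogorov-like f^(-5/3) spectrum,
# as a stand-in for the paper's k-nearest-neighbor entropy-rate estimator.
import numpy as np

rng = np.random.default_rng(0)
N = 2**18
f = np.fft.rfftfreq(N)                     # frequencies in cycles/sample
f[0] = np.inf                              # suppress the zero-frequency (mean) mode
f_inj, f_diss = 1e-4, 1e-1                 # hypothetical injection/dissipation scales
E = f**(-5.0/3.0) * np.exp(-f / f_diss)    # inertial-range spectrum with viscous cutoff
E[f < f_inj] = f_inj**(-5.0/3.0)           # flat spectrum below the injection scale
phases = np.exp(2j * np.pi * rng.random(f.size))
u = np.fft.irfft(np.sqrt(E) * phases, n=N) # Gaussian signal with spectrum E

for tau in np.unique(np.logspace(0, 4, 25).astype(int)):
    incr = u[tau:] - u[:-tau]                        # increment at scale tau
    h = 0.5 * np.log(2 * np.pi * np.e * incr.var())  # Gaussian entropy of increment
    print(tau, h)
# In the inertial range h grows like (1/3)*ln(tau); the slope flattens at the
# injection and dissipative ends, which is how the three ranges separate.
```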

Cited by 21 publications (13 citation statements). References: 23 publications.

“…Interestingly, this splits h^(m,τ)(X) into two contributions: H(X) only depends on the one-point statistics of X and is hence a static property. I^(m,1,τ)(X) gathers all information conveyed by linear and nonlinear temporal dynamics, irrespective of the variance of X [21]. For illustration, when X is a stationary jointly Gaussian process, hence fully defined by its variance σ² and normalized correlation function c(τ) (such that c(τ = 0) = 1), one has:…”
Section: A. Entropy, Entropy Rate and Mutual Information (mentioning)
confidence: 99%
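
The quoted passage is cut off at "one has:". For the simplest case m = 1, the standard closed forms for a stationary jointly Gaussian process read as follows (textbook Gaussian results given for context; not a verbatim reconstruction of the citing paper's equation):

```latex
% Bivariate Gaussian closed forms (m = 1); standard results, not a quote.
H(X) = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\right),
\qquad
I^{(1,1,\tau)}(X) = -\tfrac{1}{2}\ln\!\left(1 - c(\tau)^{2}\right),
```

so that the entropy rate h^(1,τ)(X) = H(X) - I^(1,1,τ)(X) = (1/2) ln(2πe σ² (1 - c(τ)²)). For general m the mutual information term involves the log-determinant of the (m+1)-point correlation matrix; the display above is the m = 1 special case.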
“…The entropy rate h(τ) evolution with the time scale τ (Figure 2a) reveals three different regimes, as would the power spectrum [24]. Between the small and the large scales, indicated by vertical dashed lines, the entropy rate evolves linearly in ln τ with a slope H = 1/3, just as it would for a traditional fBm: this is the inertial regime.…”
Section: Regularized and Stationarized fBm (mentioning)
confidence: 75%
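
The slope-H behaviour quoted here follows from a one-line Gaussian computation (a standard fBm fact added for context, not quoted from the citing paper; S denotes the Gaussian entropy of the increment, to avoid clashing with the Hurst exponent H):

```latex
% Entropy of an fBm increment, using Var(\delta_\tau B) = \sigma^2 \tau^{2H}:
S(\delta_\tau B)
  = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\tau^{2H}\right)
  = H\,\ln\tau + \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\right),
```

a straight line in ln τ whose slope is the Hurst exponent; Kolmogorov scaling in the inertial range corresponds to H = 1/3.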
“…For the fGn, these two quantities converge to a constant value when τ is increased, but it can be seen that the entropy of the increments converges from above when H < 1/2. For the fBm, the two quantities increase linearly in ln τ, with a slope that is exactly the Hurst exponent H [23,24]. For the time-integrated fBm, which has a generalized Hurst exponent larger than 1, the two quantities also evolve linearly in ln τ, but with a constant slope 1 independent of H. This indicates that neither the entropy of the increments nor the entropy rate can be used to estimate H ≥ 1.…”
Section: Numerical Observations (mentioning)
confidence: 99%
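
The quoted behaviour is easy to reproduce numerically. The sketch below is an illustration under the same Gaussian-entropy shortcut as above, not the citing paper's k-NN estimator; the function names and parameter values are our own choices. It generates fractional Gaussian noise by Cholesky factorization of its exact covariance, integrates it once into fBm and twice into a generalized-exponent process, and fits the slope of the increment entropy against ln τ:

```python
# Numerical check of the quoted statement, under the Gaussian-entropy shortcut
# (an illustration; names and parameter values here are our own assumptions).
import numpy as np

def fgn(n, hurst, rng):
    """Fractional Gaussian noise via Cholesky factorization of its exact
    autocovariance gamma(k) = 0.5*(|k-1|^2H - 2|k|^2H + |k+1|^2H)."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k - 1) ** (2 * hurst)
                   - 2 * k ** (2 * hurst)
                   + (k + 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]     # Toeplitz covariance matrix
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def increment_entropy_slope(x, taus):
    """Slope of the Gaussian increment entropy 0.5*ln(2*pi*e*Var) vs ln(tau)."""
    h = [0.5 * np.log(2 * np.pi * np.e * (x[t:] - x[:-t]).var()) for t in taus]
    return np.polyfit(np.log(taus), h, 1)[0]

rng = np.random.default_rng(1)
hurst, taus = 0.7, np.arange(1, 30)
noise = fgn(2048, hurst, rng)     # fGn: increment entropy tends to a constant
fbm = np.cumsum(noise)            # fBm: slope ~ hurst
ifbm = np.cumsum(fbm)             # integrated fBm: slope saturates near 1

print("fBm  slope ~ H:", increment_entropy_slope(fbm, taus))
print("ifBm slope ~ 1:", increment_entropy_slope(ifbm, taus))
```

With hurst = 0.7, the first print returns roughly 0.7 while the second sits near 1, consistent with the statement that the slope saturates at 1 and H ≥ 1 cannot be read off the entropy of the increments.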
“…The entropy rate h(τ) evolution with the time scale τ (Figure 2a) reveals three different regimes, as would the power spectrum [24]. Between the small and the large scales, indicated by vertical dashed lines, the entropy rate evolves linearly in ln τ with a slope H = 1/3, just as it would for a traditional fBm: this is the inertial regime.…”
Section: The Entropy Rate H(τ) (mentioning)
confidence: 78%