2005
DOI: 10.1098/rspb.2004.2889
Inferences about information flow and dispersal for spatially extended population systems using time-series data

Abstract: This work explores an information-theoretic approach to drawing inferences about coupling of spatially extended ecological populations based solely on time-series of abundances. The efficacy of the approach, time-delayed mutual information, was explored using a spatially extended predator-prey model system in which populations at different patches were coupled via diffusive movement. The approach identified the relative magnitude and direction of information flow resulting from animal movement between populati…

Cited by 13 publications (7 citation statements)
References 47 publications
“…7 have a slope equal to the information transfer velocity [Eq. (13)]; these lines closely match the perturbation wavefront. On the one hand this is to be expected since the perturbation can be considered as a piece of information that travels through the system.…”
Section: Measuring Space-time Information Flow With Finite Perturbations (supporting)
confidence: 54%
“…Even though every part of an STC system has the ability to change the evolution of the entire system, there exists an effective decoupling between distant parts; this can be quantified with correlation length scales [1,11], time-delayed mutual information between distant points [12,13], or transfer entropy [14]. This decoupling directly relates to Ruelle's claim [15] that extended chaotic systems without long-range interactions are uncorrelated at large length scales and therefore should behave as a sum of their parts.…”
Section: Introduction (mentioning)
confidence: 99%
“…Results (not shown) indicate that estimates of entropy are marginally sensitive to the number of bins. As a rule of thumb, [38,56] suggest using between 10 and 20 bins and more than 500-1000 data to estimate entropy, mutual information, and other information theory statistics in finite-interval bin-counting probability mass function estimation schemes. In our case, entropy and other related quantities were calculated with 100 bins, the optimal number obtained with the algorithm proposed by Knuth [37].…”
Section: Entropy Under Aggregation In Time (mentioning)
confidence: 99%
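The bin-counting entropy estimate described in the excerpt above can be sketched as follows. This is a minimal illustration assuming NumPy; the function name `binned_entropy` and its defaults are hypothetical, not from the cited work:

```python
import numpy as np

def binned_entropy(x, bins=100):
    """Estimate Shannon entropy (in nats) of a 1-D signal by bin counting.

    The 100-bin default mirrors the excerpt's choice; the cited rule of
    thumb is 10-20 bins with well over 500-1000 samples, or an optimal
    bin count chosen by Knuth's algorithm.
    """
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()  # drop empty bins before the log
    return -np.sum(p * np.log(p))
```

With too few samples per bin the estimate is biased upward for mutual information and downward for entropy, which is why the excerpt ties the bin count to the sample size.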
“…where log was taken to be the natural log and p(x i , y j ) is the joint probability density that a measurement drawn from the signals x and y will result in the values of x i and y j [3,11]. The individual probability densities for the measurements x i in x and y i in y are given by p(x i ) and p(y j ), respectively.…”
Section: Average Mutual Information (mentioning)
confidence: 99%
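The average mutual information formula quoted above, I(X;Y) = Σᵢⱼ p(xᵢ, yⱼ) log[p(xᵢ, yⱼ) / (p(xᵢ) p(yⱼ))] with natural log, can be estimated from two time series by joint bin counting, and a time delay can be introduced to probe directionality, as in the paper's time-delayed mutual information approach. A minimal sketch assuming NumPy; the function names and the 16-bin choice are illustrative assumptions:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Average mutual information I(X;Y) in nats via joint bin counting."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()              # joint probability p(x_i, y_j)
    px = pxy.sum(axis=1, keepdims=True)    # marginal p(x_i)
    py = pxy.sum(axis=0, keepdims=True)    # marginal p(y_j)
    nz = pxy > 0                           # sum only over occupied cells
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz]))

def time_delayed_mi(x, y, lag, bins=16):
    """I(x(t); y(t+lag)): a positive lag probes information flow x -> y."""
    if lag == 0:
        return mutual_information(x, y, bins)
    return mutual_information(x[:-lag], y[lag:], bins)
```

Comparing `time_delayed_mi(x, y, lag)` against `time_delayed_mi(y, x, lag)` across lags gives the kind of directional coupling signal the abstract describes, with the caveat that binned estimates carry a positive bias for finite samples.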