1999
DOI: 10.1016/s0167-2789(98)00269-3
Estimating the errors on measured entropy and mutual information

Cited by 253 publications (182 citation statements)
References 16 publications
“…n_j log n_j + log m, (2.2) with the error due to a finite number of data points estimated as (m − 1)/2M [19], where M is the total number of data points. Next, we compared I_tot and I_ext.…”
Section: Results
confidence: 99%
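The correction quoted above is the finite-sample bias of the naive (plug-in) entropy estimator that Roulston analyses: with m occupied bins and M data points, the plug-in estimate systematically underestimates the entropy by approximately (m − 1)/(2M) nats. A minimal sketch of that correction (the function name and example counts are illustrative, not from the cited papers):

```python
import numpy as np

def entropy_with_bias_error(counts):
    """Plug-in entropy of a histogram, in nats, together with the
    finite-sample bias estimate (m - 1) / (2M), where m is the number
    of occupied bins and M is the total number of data points."""
    counts = np.asarray(counts, dtype=float)
    M = counts.sum()                      # total number of data points
    p = counts[counts > 0] / M            # probabilities of occupied bins
    H_naive = -np.sum(p * np.log(p))      # plug-in entropy estimate
    m = np.count_nonzero(counts)          # number of occupied bins
    bias = (m - 1) / (2.0 * M)            # systematic underestimate of H
    return H_naive, bias

# Example: 1000 samples spread over 4 bins
H, err = entropy_with_bias_error([400, 300, 200, 100])
```

Adding `err` back onto `H` gives the bias-corrected estimate; the correction shrinks as 1/M, so it matters most for short time series binned finely.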
“…To establish functional connectivity one has to show that the statistical dependencies are significant. This entails, in its most general formulation, measuring the mutual information among two or more time-series (Roulston, 1999;Quian Quiroga et al, 2002). There are several approaches to assessing mutual information, which divide broadly into linear and non-linear.…”
Section: Introduction
confidence: 99%
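The non-linear route mentioned in this citation statement is typically a histogram (plug-in) estimate of mutual information, I(X;Y) = H(X) + H(Y) − H(X,Y), whose finite-sample errors are what Roulston's paper quantifies. A sketch under that standard approach (bin count and test signals are illustrative assumptions):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of I(X;Y) in nats,
    computed as H(X) + H(Y) - H(X,Y)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()     # joint probabilities
    p_x = p_xy.sum(axis=1)         # marginal over y
    p_y = p_xy.sum(axis=0)         # marginal over x

    def H(p):
        p = p[p > 0]               # ignore empty bins
        return -np.sum(p * np.log(p))

    return H(p_x) + H(p_y) - H(p_xy)

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.1 * rng.normal(size=5000)   # strongly dependent on x
z = rng.normal(size=5000)             # independent of x
```

For independent signals the estimate is small but not exactly zero: the plug-in estimator has a positive bias of the kind the cited paper corrects for, which is why significance testing is needed before claiming functional connectivity.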
“…The first group is identified by a strong attractive behaviour towards the food disks (seekers), whereas the second group is identified by a strong attractive behaviour towards other agents with food (parasites). We estimate the MI_4 in stage 1 and stage 3 for every agent by using the corrected standard deviation formula [19]. Before learning (Fig. 3(A),(D)) the reflex-output loop predominates over the predictor-output loop for both the food attraction behaviour and the others attraction behaviour:…”
Section: Results
confidence: 99%