2008
DOI: 10.1103/physreve.78.021141
Effects of correlated variability on information entropies in nonextensive systems

Abstract: We have calculated the Tsallis entropy and Fisher information matrix (entropy) of spatially correlated nonextensive systems, by using an analytic non-Gaussian distribution obtained by the maximum entropy method. The effects of the correlated variability on the Fisher information matrix are shown to be different from those on the Tsallis entropy. The Fisher information is increased (decreased) by a positive (negative) correlation, whereas the Tsallis entropy is decreased with increasing absolute magnitude of th…
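The abstract's claim that entropy decreases with the absolute magnitude of the correlation can be illustrated in the q → 1 (Shannon) limit, where the Tsallis entropy reduces to the ordinary differential entropy. The sketch below uses a bivariate Gaussian as a stand-in for the paper's analytic non-Gaussian maximum-entropy distribution (which is not reproduced on this page); the function name `bivariate_gaussian_entropy` and parameters `sigma`, `rho` are illustrative, not from the paper.

```python
import math

def bivariate_gaussian_entropy(sigma=1.0, rho=0.0):
    """Differential (Shannon) entropy of a bivariate Gaussian with equal
    marginal variance sigma**2 and correlation coefficient rho.

    H = (1/2) * ln((2*pi*e)^2 * det(Sigma)),  det(Sigma) = sigma^4 * (1 - rho^2)
    """
    det_sigma = sigma**4 * (1.0 - rho**2)
    return 0.5 * math.log((2.0 * math.pi * math.e) ** 2 * det_sigma)

# The entropy is maximal at rho = 0 and decreases symmetrically in |rho|,
# mirroring the abstract's statement for the q -> 1 limit.
for rho in (0.0, 0.5, -0.5, 0.9):
    print(f"rho = {rho:+.1f}: H = {bivariate_gaussian_entropy(rho=rho):.4f}")
```

Because det(Sigma) depends on the correlation only through rho**2, positive and negative correlations of equal magnitude give the same entropy, unlike the Fisher information, which the abstract reports as asymmetric in the sign of the correlation.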

Cited by 8 publications (15 citation statements). References 29 publications (65 reference statements).
“…The point (iii) shows that if input information is carried by synchrony within the population code hypothesis [50,51], its decoding accuracy may be improved either by small or large correlation, independently of q [the point (v)]. Our calculation concerns the long-standing controversy on a role of the synchrony in neuronal ensembles [5]- [12].…”
Section: Discussion
confidence: 99%
“…(1) [24,25]. We derive the PDF, p(x), by using the OLM-MEM [29] for the Tsallis entropy, imposing the constraints given by [49] …”
Section: Maximum Entropy Method (Probability Distribution Function)
confidence: 99%
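The citation above refers to deriving a PDF by maximizing the Tsallis entropy under constraints. The standard one-dimensional solution of that variational problem is the q-Gaussian, p(x) ∝ [1 - (1-q)·β·x²]₊^{1/(1-q)}, which reduces to an ordinary Gaussian as q → 1. The sketch below evaluates this known family numerically; it is not the paper's specific spatially correlated OLM-MEM distribution, and the names `q_exponential`, `q_gaussian_pdf`, and `beta` are illustrative.

```python
import numpy as np

def q_exponential(u, q):
    """q-exponential e_q(u) = [1 + (1-q)u]_+^{1/(1-q)}, with e_1(u) = exp(u)."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(u)
    base = np.maximum(1.0 + (1.0 - q) * u, 0.0)  # [.]_+ cutoff for q < 1
    return base ** (1.0 / (1.0 - q))

def q_gaussian_pdf(x, q=1.5, beta=1.0):
    """q-Gaussian on a uniform grid x, normalized numerically (Riemann sum)."""
    p = q_exponential(-beta * x**2, q)
    dx = x[1] - x[0]
    return p / (np.sum(p) * dx)

x = np.linspace(-10.0, 10.0, 20001)
for q in (0.5, 1.0, 1.5):
    p = q_gaussian_pdf(x, q=q)
    dx = x[1] - x[0]
    var = np.sum(x**2 * p) * dx
    print(f"q = {q}: variance = {var:.4f}")
```

For q < 1 the distribution has compact support (the cutoff in `q_exponential`), for q = 1 it is the Gaussian with variance 1/(2β), and for 1 < q < 5/3 it has power-law tails with finite variance — the non-Gaussian regime the nonextensive formalism is built around.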