2008
DOI: 10.1103/physreve.77.026205
Transmission of information in active networks

Abstract: Shannon's capacity theorem is the central concept behind the theory of communication. It states that if the amount of information contained in a signal is smaller than the channel capacity of a physical medium of communication, the signal can be transmitted with an arbitrarily small probability of error. This theorem usually applies to ideal channels of communication, in which the information to be transmitted does not alter the passive characteristics of the channel, which basically tries to reproduce the source of informa…

Cited by 41 publications (63 citation statements) · References 39 publications
“…It is interesting to note that the high information being transferred requires both synchronization and desynchronization at different frequency bands, as suggested by computer simulations of neuron models. Specifically, as shown in Baptista and Kurths (2008), the maximum capacity of information transfer between neurons can be reached if the neurons synchronize, as in Figure 6a (a, b); in the visual condition, as identified in Figure 7a (c, d). The plots (a) and (c) show the connections that directly support the identified contrasts at 8-11 Hz (with bootstrap ratio values >7).…”
Section: Discussion
Citation type: mentioning
confidence: 99%
“…In those networks, MI and the synchronization level increase simultaneously as the entropy of the whole system decreases. The maximum of MI is achieved for the same coupling strength at which complete synchronization appears [22][23][24][25]. Hence, MI can be used to distinguish synchronous and asynchronous states in coupled stochastic networks.…”
Section: Detecting Correlations and Phase Synchronization With Mutual Information
Citation type: mentioning
confidence: 99%
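The use of mutual information (MI) as a synchronization marker described in this statement can be sketched with a simple histogram (plug-in) estimator: MI between two strongly coupled signals is large, while for independent signals it is close to zero (an illustrative script, not code from the cited works; the estimator and all parameter values are choices made here):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in MI estimate (in bits) from a 2-D histogram of two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()          # joint distribution p(x, y)
    px = pxy.sum(axis=1)               # marginal p(x)
    py = pxy.sum(axis=0)               # marginal p(y)
    nz = pxy > 0                       # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] *
                        np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
# A nearly synchronized pair shares most of its variance...
mi_sync = mutual_information(x, x + 0.1 * rng.normal(size=5000))
# ...while an independent pair carries almost no mutual information.
mi_indep = mutual_information(x, rng.normal(size=5000))
```

Note that this plug-in estimator has a small positive bias for finite samples, so `mi_indep` is close to, but not exactly, zero.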
“…In [18,19], such a localization of the STPs (obtained from a sufficiently long time series) is mathematically described in the following manner. Let the distribution of the STPs be contained in a set D on the unit circle; D is localized if there exist open sets Λ_i on the circle such that D ∩ ⋃_i Λ_i = ∅.…”
Section: Bursting Neuron Model
Citation type: mentioning
confidence: 99%
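The localization condition quoted above — the phase distribution avoids some open arcs of the unit circle — can be sketched numerically by binning phases into arcs and checking for arcs that are never visited (an illustrative sketch under our own discretization; the arc count and function names are assumptions, not the construction of [18,19]):

```python
import numpy as np

def is_localized(phases, n_arcs=32):
    """Crude localization check: treat the phase distribution as localized
    if at least one arc of the unit circle is never visited."""
    counts, _ = np.histogram(np.mod(phases, 2 * np.pi),
                             bins=n_arcs, range=(0.0, 2 * np.pi))
    return bool(np.any(counts == 0))

rng = np.random.default_rng(1)
# Phases confined to a half-circle: localized.
confined = rng.uniform(0.0, np.pi, size=2000)
# Phases spread over the whole circle: not localized.
spread = rng.uniform(0.0, 2 * np.pi, size=2000)
```

Here an empty histogram bin plays the role of an open set Λ_i disjoint from the support D of the phase distribution.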