2020
DOI: 10.1101/2020.06.01.127399
Preprint

Comparative performance of mutual information and transfer entropy for analyzing the balance of information flow and energy consumption at synapses

Abstract: Information theory has become an essential tool of modern neuroscience. It can, however, be difficult to apply in experimental contexts when the acquisition of very large datasets is prohibitive. Here, we compare the relative performance of two information-theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of reliability of estimates for small datasets. Howe…
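To make the abstract's small-sample reliability claim concrete, here is a minimal plug-in (maximum-likelihood) mutual-information estimator in plain Python. This is an illustrative sketch only, not the preprint's actual analysis pipeline; the plug-in estimator is known to be biased upward when samples are scarce, which is the kind of effect the preprint quantifies.

```python
import math
import random
from collections import Counter

def plugin_mi(x, y):
    """Plug-in (maximum-likelihood) estimate of I(X;Y) in bits for two
    equal-length discrete sequences; biased upward for small samples."""
    n = len(x)
    cxy, cx, cy = Counter(zip(x, y)), Counter(x), Counter(y)
    return sum(c / n * math.log2(c * n / (cx[a] * cy[b]))
               for (a, b), c in cxy.items())

random.seed(0)
big   = [[random.randint(0, 1) for _ in range(5000)] for _ in range(2)]
small = [v[:50] for v in big]
# X and Y are independent, so the true MI is 0; the plug-in estimate
# converges to 0 as the sample grows but is inflated on average
# when only 50 samples are available.
print(plugin_mi(*small), plugin_mi(*big))
```

A sanity check: feeding the estimator two identical, near-balanced binary sequences returns the entropy of the sequence, close to 1 bit.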


Cited by 6 publications (13 citation statements)
References 29 publications
“…Since the input connects to the first network unidirectionally, we can use direct mutual information to determine information transfer from the stimulus (Conrad and Jolivet, 2020). This approach enhances interpretability of the statistics compared with other information transfer methods.…”
Section: Information Transfer
confidence: 99%
“…4B). Since the input was connected to the first network unidirectionally, and we expected a monotonic relationship between the input and the oscillatory network activity, we used Spearman correlation to determine information transfer from the stimulus, which improves the interpretability of the statistics compared with other information transfer methods (Conrad and Jolivet, 2020). As such, we correlated the input signal with the amplitude envelope of the two networks.…”
Section: Phase Coupling Emerges At Criticality and Extends Into (Slig...
confidence: 99%
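The Spearman rank correlation used in the citation above can be sketched in a few lines of plain Python. This is a generic, self-contained implementation (Pearson correlation of average ranks), not the citing paper's code:

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with ties assigned their average rank."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # extend over a run of tied values
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank of the run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy stimulus and amplitude envelope (hypothetical values): any strictly
# monotonic relationship yields a rank correlation of 1.
stimulus = [0.1, 0.4, 0.2, 0.9, 0.7]
envelope = [1.0, 2.5, 1.4, 5.0, 3.9]
print(spearman(stimulus, envelope))  # ≈ 1.0 (perfect monotonic relation)
```

Because only ranks are compared, the measure is invariant to any monotonic transformation of either signal, which is why it suits the monotonic input-to-envelope relationship the authors expected.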
“…For the first type of measure, two related quantities are typically used to quantify the information throughput of a neural system: the symmetric mutual information I(X;Y) between input X and output Y (Shannon (1948), Eq. 14), and the directed transfer entropy T(X→Y) from input X to output Y (Schreiber (2000), Eq. 15). Both have their advantages, and a recent review found that the transfer entropy is typically less biased when data is sparse (Conrad and Jolivet, 2021).…”
Section: Sparsity Typically Increases Standard Measures Of Pattern Se...
confidence: 99%
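The directed transfer entropy quoted above can be illustrated with a plug-in estimator for discrete sequences with history length 1. This is a textbook sketch of Schreiber's definition, T(X→Y) = H(Y_t | Y_{t-1}) − H(Y_t | Y_{t-1}, X_{t-1}), expanded into four joint entropies; it is not the review's implementation:

```python
import math
import random
from collections import Counter

def entropy(counts, n):
    """Shannon entropy in bits from a Counter of joint symbols."""
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def transfer_entropy(x, y):
    """Plug-in estimate of T(X->Y) with history length 1:
    T = H(Y_t, Y_{t-1}) - H(Y_{t-1})
        - H(Y_t, Y_{t-1}, X_{t-1}) + H(Y_{t-1}, X_{t-1})."""
    n = len(y) - 1
    yy  = Counter(zip(y[1:], y[:-1]))          # (Y_t, Y_{t-1})
    yp  = Counter(y[:-1])                      # Y_{t-1}
    yyx = Counter(zip(y[1:], y[:-1], x[:-1]))  # (Y_t, Y_{t-1}, X_{t-1})
    yx  = Counter(zip(y[:-1], x[:-1]))         # (Y_{t-1}, X_{t-1})
    return (entropy(yy, n) - entropy(yp, n)
            - entropy(yyx, n) + entropy(yx, n))

# Toy demo: Y copies X with a one-step delay, so X fully drives Y.
random.seed(1)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
# Forward direction ≈ 1 bit (the entropy of the fair binary source);
# reverse direction ≈ 0, reflecting the asymmetry MI cannot capture.
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

Unlike I(X;Y), which is symmetric, the estimator above returns near-zero in the acausal direction, which is the directedness the quoted passage contrasts with mutual information.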
“…The additional term in the description of the NMDA conductance is added to describe the nonlinear I-V relation of NMDA receptors due to the Mg²⁺ block [10]. In each case, g AMPA and g NMDA are determined by the procedure mentioned above combining Equations [5] and [6]. An example of what that procedure yields can be observed in the top two panels of Fig.…”
Section: Modelling Synaptic Input Synaptic Depression and Neuromodulation
confidence: 99%
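The voltage-dependent Mg²⁺ block behind the nonlinear I-V relation mentioned above is commonly parameterized following Jahr & Stevens (1990). The sketch below assumes those standard constants (3.57 mM, 0.062 mV⁻¹) and a 0 mV reversal potential; the citing paper's actual values may differ:

```python
import math

def mg_block(v_mV, mg_mM=1.0):
    """Voltage-dependent Mg2+ unblock factor for NMDA receptors,
    using the common Jahr & Stevens (1990) parameterization
    (constants assumed, not taken from the citing paper)."""
    return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * v_mV))

def i_nmda(v_mV, g_nmda_nS, e_rev_mV=0.0, mg_mM=1.0):
    """NMDA current: a linear driving force scaled by the Mg block,
    which produces the nonlinear I-V relation described in the text."""
    return g_nmda_nS * mg_block(v_mV, mg_mM) * (v_mV - e_rev_mV)

# Near rest (-70 mV) the block suppresses most of the conductance;
# at depolarized potentials (-20 mV) roughly half the channels conduct.
print(mg_block(-70.0), mg_block(-20.0))
```

Multiplying the peak conductance g NMDA from the authors' fitting procedure by this unblock factor is what distinguishes the NMDA term from the ohmic AMPA term.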
“…We refer the reader to refs. [11,6] for a comparative discussion of these measures in a context similar to the one discussed here. The energy consumption in thalamic relay cells in this scenario arises from presynaptic activity and from the generation of output action potentials.…”
Section: Information Flow and Neuroenergetics
confidence: 99%