Abstract

Information theory has become an essential tool of modern neuroscience. It can, however, be difficult to apply in experimental contexts where the acquisition of very large datasets is prohibitive. Here, we compare the relative performance of two information theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of the reliability of estimates obtained from small datasets. However, we also show that a detailed understanding of the underlying neuronal biophysics is essential for properly interpreting the results obtained with transfer entropy. We conclude that, when time and experimental conditions permit, mutual information might provide an easier-to-interpret alternative. Finally, we apply both measures to the study of the energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures recapitulate the experimental finding that these synapses are tuned to optimally balance the information flowing through them against the energetic consumption associated with that synaptic and neuronal activity. Our results highlight the importance of conducting systematic computational studies before applying information theoretic tools to experimental data.
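For reference, the two measures compared here are standardly defined as follows for discrete (e.g., binarized) signals; the notation below is the common textbook formulation, not taken from this text. The mutual information between a stimulus $S$ and a response $R$ is

\[
I(S;R) = \sum_{s,r} p(s,r)\,\log_2 \frac{p(s,r)}{p(s)\,p(r)},
\]

and the transfer entropy from a source process $X$ to a target process $Y$, with history length $k$, is

\[
TE_{X \to Y} = \sum_{y_{t+1},\, y_t^{(k)},\, x_t^{(k)}} p\!\left(y_{t+1}, y_t^{(k)}, x_t^{(k)}\right) \log_2 \frac{p\!\left(y_{t+1} \mid y_t^{(k)}, x_t^{(k)}\right)}{p\!\left(y_{t+1} \mid y_t^{(k)}\right)}.
\]

Conditioning on the target's own past $y_t^{(k)}$ is what makes transfer entropy directional, in contrast to the symmetric mutual information.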
Author summary

Information theory has become an essential tool of modern neuroscience. It is routinely used to evaluate how much information flows from external stimuli to various brain regions or individual neurons. It is also used to evaluate how information flows between brain regions, between neurons, across synapses, or in neural networks. Information theory offers multiple measures for this purpose. Two of the most popular are mutual information and transfer entropy. While these measures are related to each other, they differ in one important respect: transfer entropy reports a directional flow of information, whereas mutual information does not. Here, we proceed to a systematic evaluation of their respective performances and trade-offs from the perspective of an experimentalist looking to apply these measures to binarized spike trains. We show that transfer entropy might be a better choice than mutual information when time for experimental data collection is limited, as it appears less affected by the systematic biases induced by a relative lack of data. Transmission delays and the integration properties of the output neuron can, however, complicate this picture, and we provide an example of the effect this has on both measures. We conclude that, when time and experimental conditions permit, mutual information (especially when estimated using a method referred to as the 'direct' method) might provide an easier-to-interpret alternative. Finally, we apply both measures in the biophysical context of evaluating the energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures capture the original experimental finding that those synapses are tuned to optimally balance the information flowing through them against the associated energetic consumption.
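As a concrete illustration of what estimating these quantities from binarized spike trains involves, below is a minimal plug-in estimator in Python. This is a sketch under our own conventions (all function names and the history encoding are illustrative assumptions), not the authors' code or a full implementation of the 'direct' method; naive plug-in estimators like this one exhibit precisely the small-sample upward bias discussed above.

# Minimal sketch (assumed names, not the authors' code): naive plug-in
# estimates of mutual information and transfer entropy for binary spike trains.
import numpy as np

def entropy(counts):
    """Plug-in Shannon entropy in bits from a table of symbol counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """Plug-in I(X;Y) in bits for two equal-length binary sequences."""
    joint = np.zeros((2, 2))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    return (entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))
            - entropy(joint.ravel()))

def transfer_entropy(x, y, k=1):
    """Plug-in TE(X -> Y) in bits with history length k, computed as
    H(Y_t, Y_past) - H(Y_past) - H(Y_t, Y_past, X_past) + H(Y_past, X_past)."""
    def histories(s):
        # Encode each k-sample history window as one integer symbol.
        return np.array([int("".join(map(str, s[t - k:t])), 2)
                         for t in range(k, len(s))])
    yp, xp = histories(y), histories(x)   # pasts of target and source
    yt = np.asarray(y[k:])                # present of target, aligned to pasts
    def joint_entropy(*vars_):
        codes = np.ravel_multi_index(vars_, [int(v.max()) + 1 for v in vars_])
        return entropy(np.bincount(codes).astype(float))
    return (joint_entropy(yt, yp) - joint_entropy(yp)
            - joint_entropy(yt, yp, xp) + joint_entropy(yp, xp))

# Toy check: y is x delayed by one time step.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
y = np.roll(x, 1)
y[0] = 0
print(transfer_entropy(x, y))    # ~1 bit: the past of X predicts Y
print(transfer_entropy(y, x))    # ~0 bits: no flow in the reverse direction
print(mutual_information(x, y))  # ~0 bits at zero lag: MI misses the delay

The toy check at the end also illustrates the delay issue raised in the summary: transfer entropy recovers the directed, delayed dependence, while zero-lag mutual information reports almost nothing, and both estimates retain a small positive bias that grows as the dataset shrinks.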