Understanding the operation of neural networks in the brain requires knowing whether interactions among neurons can be described by a pairwise interaction model, or whether a higher-order interaction model is needed. In this article we consider the rate of synchronous discharge of a local population of neurons, a macroscopic index of network activation that can be measured experimentally. We analyse a model based on the maximum entropy principle of statistical physics that evaluates whether the probability of synchronous discharge can be described by interactions up to any given order. When compared with real neural population activity recorded from the rat somatosensory cortex, the model shows that interactions of at least third or fourth order are necessary to explain the data. We use Shannon information to compute the impact of high-order correlations on the amount of somatosensory information transmitted by the rate of synchronous discharge, and we find that correlations of higher order progressively decrease the information available through the neural population. These results are compatible with the hypothesis that high-order interactions play a role in shaping the dynamics of neural networks, and that they should be taken into account when computing the representational capacity of neural populations.
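The core test described in this abstract — asking whether a first-order (independent) maximum entropy model can reproduce the observed distribution of the synchronous-discharge count — can be illustrated with a toy sketch. This is not the authors' actual analysis: the spike raster, firing rates, and injected population-wide events below are invented for illustration, and only the simplest (order-1) model is compared against the data.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy binary spike raster: 5 neurons x 10000 time bins, with
# injected higher-order structure (rare population-wide events).
n, t = 5, 10000
rates = rng.uniform(0.05, 0.2, n)
spikes = (rng.random((t, n)) < rates).astype(int)
sync = rng.random(t) < 0.02          # ~2% of bins: everyone fires
spikes[sync] = 1

# Empirical distribution of the synchronous-discharge count k
k = spikes.sum(axis=1)
p_emp = np.bincount(k, minlength=n + 1) / t

# Order-1 (independent) maximum-entropy prediction for the count:
# product of Bernoulli marginals, summed over patterns with that k
p1 = spikes.mean(axis=0)
p_ind = np.zeros(n + 1)
for pattern in product([0, 1], repeat=n):
    pr = np.prod([p1[i] if b else 1 - p1[i] for i, b in enumerate(pattern)])
    p_ind[sum(pattern)] += pr

# KL divergence D(p_emp || p_ind): a clearly positive residual signals
# correlations that the first-order model cannot capture
mask = p_emp > 0
dkl = np.sum(p_emp[mask] * np.log2(p_emp[mask] / p_ind[mask]))
print(f"D_KL(empirical || independent) = {dkl:.3f} bits")
```

Because the raster contains all-or-none population events, the independent model badly underestimates the probability of the full-population count, and the divergence comes out well above zero. The abstract's approach generalizes this idea by fitting maximum entropy models constrained at successively higher orders until the residual vanishes.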
The spiking activity of nearby cortical neurons is not independent. Numerous studies have explored the importance of this correlated responsivity for visual coding and perception, often by comparing the information conveyed by pairs of simultaneously recorded neurons with the sum of the information provided by the respective individual cells. Pairwise responses typically provide slightly more information, so that encoding is weakly synergistic. This simple comparison between pairwise and summed individual responses conflates several forms of correlation, however, making it impossible to judge the relative importance of synchronous spiking, basic tuning properties, and stimulus-independent and stimulus-dependent correlation. We have applied an information-theoretic approach to this question, using the responses of pairs of neurons to drifting sinusoidal gratings of different directions and contrasts, recorded in the primary visual cortex of anesthetized macaque monkeys. Our approach allows us to break down the information provided by pairs of neurons into a number of components. This analysis reveals that, although synchrony is prevalent and informative, the additional information it provides is frequently offset by the redundancy arising from the similar tuning properties of the two cells. Thus coding is approximately independent, with weak synergy or redundancy arising depending on the similarity in tuning and the temporal precision of the analysis. We suggest that this would allow cortical circuits to enjoy the stability provided by similarly tuned neurons without suffering the penalty of redundancy, because the associated information transmission deficit is compensated for by stimulus-dependent synchrony.
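The comparison at the heart of this analysis — pair information versus the sum of single-cell informations, with the sign of the difference indicating synergy or redundancy — can be sketched on a toy joint distribution. The stimulus/response probabilities below are invented for illustration (two stimuli, two binary neurons with similar tuning); the paper's actual breakdown separates more components than this single synergy index.

```python
import numpy as np

def mutual_info(pjoint):
    """I(S;R) in bits from a joint probability table p[s, r]."""
    ps = pjoint.sum(axis=1, keepdims=True)
    pr = pjoint.sum(axis=0, keepdims=True)
    nz = pjoint > 0
    return float(np.sum(pjoint[nz] * np.log2(pjoint[nz] / (ps * pr)[nz])))

# Two equiprobable stimuli; pair responses flattened as r = 2*r1 + r2.
# Similar tuning: both cells tend to be silent for s=0, active for s=1.
p_r_given_s = np.array([
    [0.60, 0.15, 0.15, 0.10],   # s = 0: mostly both silent
    [0.10, 0.15, 0.15, 0.60],   # s = 1: mostly both fire
])
p = 0.5 * p_r_given_s           # joint p(s, r)

# Pair information vs. sum of single-cell informations
i_pair = mutual_info(p)
p1 = np.stack([p[:, [0, 1]].sum(1), p[:, [2, 3]].sum(1)], axis=1)  # p(s, r1)
p2 = np.stack([p[:, [0, 2]].sum(1), p[:, [1, 3]].sum(1)], axis=1)  # p(s, r2)
synergy = i_pair - mutual_info(p1) - mutual_info(p2)
print(f"I(pair) = {i_pair:.3f} bits, synergy = {synergy:.3f} bits")
```

With these similarly tuned cells the synergy index comes out negative, i.e. the pair is redundant, matching the abstract's observation that overlapping tuning offsets the extra information carried by synchrony.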
Various nuclear structure observables are evaluated employing low-momentum nucleon-nucleon (NN) potentials V_low-k derived from the CD-Bonn and Nijmegen NN interactions V_NN. By construction, the high-momentum modes of the original V_NN are integrated out in V_low-k, with the requirement that the deuteron binding energy and the low-energy phase shifts of V_NN are exactly reproduced. Using this interaction, we evaluate the bulk properties (binding energy and saturation density) of nuclear matter and finite nuclei, in particular their dependence on the cut-off parameter. We also study the pairing gap and the residual interaction in nuclear matter in terms of the Landau parametrization. At low and medium densities, the Hartree-Fock (HF) and Brueckner-Hartree-Fock (BHF) binding energies for nuclear matter calculated with the V_low-k potentials derived from CD-Bonn and Nijmegen are nearly identical. The pairing gaps and Landau parameters derived from V_low-k are remarkably close to those given by the full-space V_NN. The V_low-k interactions, however, fail to reproduce the saturation properties of nuclear matter at higher densities if the cut-off for the high-momentum modes is assumed to be density independent.
Population coding is the quantitative study of which algorithms or representations the brain uses to combine and evaluate the messages carried by different neurons. Here, we review an information-theoretic approach to population coding. We first discuss how to compute the information carried by simultaneously recorded neural populations, and in particular how to reduce the limited-sampling bias that affects the calculation of information from a limited amount of experimental data. We then discuss how to quantify the contribution of individual members of the population, or of the interactions between them, to the overall information encoded by the considered group of neurons. We focus in particular on evaluating the contribution of interactions up to any given order to the total information. We illustrate this formalism with applications to simulated data with realistic neuronal statistics and to real simultaneous recordings of multiple spike trains.
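The limited-sampling bias mentioned in this abstract can be demonstrated with a minimal sketch: even when stimulus and response are statistically independent (so the true information is zero bits), the naive plug-in estimate from a small number of trials is systematically positive. A shuffle-based correction is one simple remedy; the review itself covers more sophisticated bias-correction procedures. All data here are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def plugin_mi(s, r):
    """Naive plug-in estimate of I(S;R) in bits from paired samples."""
    vs, si = np.unique(s, return_inverse=True)
    vr, ri = np.unique(r, return_inverse=True)
    joint = np.zeros((len(vs), len(vr)))
    np.add.at(joint, (si, ri), 1)
    joint /= joint.sum()
    ps = joint.sum(1, keepdims=True)
    pr = joint.sum(0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps * pr)[nz])))

# Stimulus and response are independent, so the true information is 0,
# yet 40 trials over a 4 x 8 table yield a clearly positive estimate.
s = rng.integers(0, 4, 40)
r = rng.integers(0, 8, 40)
naive = plugin_mi(s, r)

# Shuffle correction: subtract the mean estimate obtained after
# destroying any stimulus-response relationship by permutation.
shuffled = np.mean([plugin_mi(rng.permutation(s), r) for _ in range(200)])
print(f"naive = {naive:.3f} bits, shuffle-corrected = {naive - shuffled:.3f} bits")
```

The shuffled estimates measure how much apparent information arises purely from finite sampling, which is why subtracting their mean pulls the corrected value back toward the true zero.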