Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task: extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from the point of view of the organism, culminating in algorithms for real-time stimulus estimation based on a single example of the spike train. These methods were applied to an identified movement-sensitive neuron in the fly visual system. Such decoding experiments determined the effective noise level and fault tolerance of neural computation, and the structure of the decoding algorithms suggested a simple model for real-time analog signal processing with spiking neurons.
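The core idea of real-time decoding can be illustrated with a minimal linear sketch: each spike contributes one copy of a fixed kernel to the running stimulus estimate. The kernel shape, time step, and spike times below are assumptions for illustration, not the filters fitted in the paper.

```python
import numpy as np

def linear_decode(spike_times, kernel, dt, duration):
    """Estimate the stimulus as a sum of kernels, one per spike:
    s_est(t) = sum_i K(t - t_i), a causal linear reconstruction."""
    n = int(duration / dt)
    estimate = np.zeros(n + len(kernel))
    for t in spike_times:
        i = int(round(t / dt))          # bin index of this spike
        estimate[i:i + len(kernel)] += kernel
    return estimate[:n]

# Toy example: three spikes decoded with an exponential kernel (assumed shape).
dt = 0.001                               # 1 ms bins
kernel = np.exp(-np.arange(0, 0.05, dt) / 0.01)
est = linear_decode([0.010, 0.020, 0.100], kernel, dt, 0.2)
```

Because the estimate is a sum over single spikes, dropping one spike degrades the reconstruction only locally, which is one way to probe fault tolerance.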
Adaptation is a widespread phenomenon in nervous systems, providing flexibility to function under varying external conditions. Here, we relate an adaptive property of a sensory system directly to its function as a carrier of information about input signals. We show that the input/output relation of a sensory system in a dynamic environment changes with the statistical properties of the environment. Specifically, when the dynamic range of inputs changes, the input/output relation rescales so as to match the dynamic range of responses to that of the inputs. We give direct evidence that the scaling of the input/output relation is set to maximize information transmission for each distribution of signals. This adaptive behavior should be particularly useful in dealing with the intermittent statistics of natural signals.
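The rescaling claim can be sketched numerically: if the response is a fixed saturating function of the normalized input s/σ, the effective input/output relation stretches with the input's dynamic range and the output statistics stay matched. The tanh nonlinearity and Gaussian inputs below are assumptions for illustration only.

```python
import numpy as np

def response(s, sigma):
    """Input/output relation that rescales with the input's standard
    deviation: a fixed function of the normalized stimulus s / sigma."""
    return np.tanh(s / sigma)

rng = np.random.default_rng(0)
stds = []
for sigma in (1.0, 4.0):                  # two input dynamic ranges
    s = rng.normal(0.0, sigma, 100_000)
    stds.append(np.std(response(s, sigma)))
# The output's spread is (nearly) identical for both input distributions,
# because the curve has rescaled to match each one.
```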
The major problem in information theoretic analysis of neural responses and other biological data is the reliable estimation of entropy-like quantities from small samples. We apply a recently introduced Bayesian entropy estimator to synthetic data inspired by experiments, and to real experimental spike trains. The estimator performs admirably even very deep in the undersampled regime, where other techniques fail. This opens new possibilities for the information theoretic analysis of experiments, and may be of general interest as an example of learning from limited data.
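The NSB estimator itself is too involved for a short sketch, but the undersampling problem it addresses is easy to demonstrate: the naive "plug-in" entropy estimate is biased downward when samples are scarce, and even the classic Miller-Madow first-order correction, (K−1)/(2N), only partly closes the gap. The uniform 256-symbol source and sample size below are assumptions for illustration.

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood ("plug-in") entropy estimate in bits."""
    n = len(samples)
    p = np.array(list(Counter(samples).values())) / n
    return -np.sum(p * np.log2(p))

def miller_madow(samples):
    """Plug-in estimate plus the first-order bias correction, in bits."""
    k = len(set(samples))                 # number of observed symbols
    return plugin_entropy(samples) + (k - 1) / (2 * len(samples) * np.log(2))

rng = np.random.default_rng(1)
true_bits = np.log2(256)                  # uniform over 256 symbols: 8 bits
small = rng.integers(0, 256, 100)         # deeply undersampled: N < K
# plugin_entropy(small) falls well below 8 bits; miller_madow(small)
# recovers some, but not all, of the missing entropy.
```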
We assessed the performance of a synapse that transmits small, sustained, graded potentials between two classes of second-order ocellar "L-neurons" of the locust. We characterized the transmission of both fixed levels of membrane potential and fluctuating signals by recording postsynaptic responses to changes in presynaptic potential. To ensure repeatability between stimuli, we controlled presynaptic signals with a voltage clamp. We found that the synapse introduces noise above the level of background activity in the postsynaptic neuron. By driving the presynaptic neuron with slow-ramp changes in potential, we found that the number of discrete signal levels the synapse transmits is ∼20. It can also transmit ∼20 discrete levels when the presynaptic signal is a graded rebound spike. Synaptic noise level is constant over the operating range of the synapse, which would not be expected if presynaptic potential set the probability for the release of individual quanta of neurotransmitter according to Poisson statistics. Responses to individual quanta of neurotransmission could not be resolved, which is consistent with a synapse that operates with large numbers of vesicles evoking small responses. When challenged with white noise stimuli, the synapse can transmit information at rates up to 450 bits/s, a performance that is sufficient to transmit natural signals about changes in illumination.
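The two headline numbers connect through standard information theory: ∼20 distinguishable levels corresponds to log2(20) ≈ 4.3 bits per graded signal, and a rate in bits/s for a noisy channel is typically computed as R = ∫ log2(1 + SNR(f)) df. The frequency band and low-pass SNR profile below are invented for illustration and are not the measured values.

```python
import numpy as np

bits_per_level = np.log2(20)          # ~20 distinguishable levels ≈ 4.3 bits

def gaussian_rate(snr, df):
    """Shannon information rate in bits/s for a tabulated SNR(f),
    integrated over frequency bins of width df (Hz)."""
    return df * np.sum(np.log2(1.0 + snr))

freqs = np.arange(0.0, 100.0, 1.0)    # assumed 0-100 Hz band, 1 Hz bins
snr = 10.0 * np.exp(-freqs / 30.0)    # hypothetical low-pass SNR profile
rate = gaussian_rate(snr, 1.0)        # bits/s for this invented channel
```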