In this paper, we document lognormal distributions of spike rates, synaptic weights, and intrinsic excitability (gain) for neurons in various brain areas, including auditory and visual cortex, hippocampus, cerebellum, striatum, and midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions of rates, weights, and gains across all of these areas. Differences in connectivity (strongly recurrent in cortex vs. feed-forward in striatum and cerebellum), in neurotransmitter (GABA in striatum vs. glutamate in cortex), and in level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turn out to be irrelevant for this feature. Lognormal distribution of weights and gains thus appears to be a ubiquitous functional property. Secondly, we present a generic neural model showing that Hebbian learning creates and maintains lognormal distributions. With this model we demonstrate that not only synaptic weights but also intrinsic gains require strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This settles a long-standing question about the type of plasticity exhibited by intrinsic excitability.
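To illustrate why Hebbian learning tends to produce lognormal distributions, the following sketch (not the model used in the paper; all parameters, variable names, and the renormalization step are illustrative assumptions) applies multiplicative, Hebbian-style updates to weights and gains. Because each step multiplies the current value, log(w) and log(g) accumulate many small additive increments and the resulting distributions are lognormal.

```python
# Minimal sketch, assuming multiplicative Hebbian updates and a weak global
# renormalization; values of n, eta, and the noise model are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                       # number of synapses / neurons
w = np.ones(n)                   # synaptic weights
g = np.ones(n)                   # intrinsic gains
eta = 0.01                       # learning rate

for _ in range(2_000):
    corr_w = rng.normal(0.0, 1.0, n)       # fluctuating pre/post correlation
    corr_g = rng.normal(0.0, 1.0, n)       # fluctuating activity of the cell
    w *= np.exp(eta * corr_w)              # Hebbian step proportional to w
    g *= np.exp(eta * corr_g)              # Hebbian-like step proportional to g
    w *= np.exp(-eta * np.log(w).mean())   # global renormalization (shifts mean only)
    g *= np.exp(-eta * np.log(g).mean())

# log(w) and log(g) are approximately Gaussian, i.e. w and g are lognormal
print("std of log(w):", np.log(w).std(), " std of log(g):", np.log(g).std())
```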
We present an unsupervised, local, activation-dependent learning rule for intrinsic plasticity (IP) which affects the composition of ion channel conductances of single neurons in a use-dependent way. We use a single-compartment conductance-based model of medium spiny striatal neurons to show the effects of parameterization of individual ion channels on the neuronal activation function. We show that parameter changes within the physiological ranges are sufficient to create an ensemble of neurons with significantly different activation functions. We emphasize that the effects of intrinsic neuronal variability on spiking behavior require a distributed mode of synaptic input and can be eliminated by strongly correlated input. We show how variability and adaptivity in ion channel conductances can be utilized to store patterns without an additional contribution by synaptic plasticity (SP). The adaptation of the spike response may result in either "positive" or "negative" pattern learning. However, read-out of the stored information depends on a distributed pattern of synaptic activity that lets intrinsic variability determine the spike response. We briefly discuss the implications of this conditional memory for learning and addiction.
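The sketch below conveys the flavor of a use-dependent conductance update on a deliberately simplified neuron (an assumption: a leaky integrate-and-fire proxy with a single "excitability" conductance g_K, not the paper's full conductance-based medium spiny neuron model). Each spike lowers g_K, shifting the neuron's activation function; all parameter values are illustrative.

```python
# Minimal sketch of an activity-dependent intrinsic plasticity rule,
# assuming a LIF proxy neuron; g_K, eta, and the drive statistics are made up.
import numpy as np

rng = np.random.default_rng(1)
dt = 0.1                         # ms
tau_m, v_th, v_reset = 20.0, 1.0, 0.0
g_K = 0.5                        # slow K+-like conductance: the excitability knob
eta = 0.01                       # intrinsic plasticity step per spike
v = 0.0
spikes = 0

for step in range(5000):                     # 500 ms of simulated time
    I_syn = rng.exponential(1.8)             # distributed, fluctuating synaptic drive
    v += dt * (-(1.0 + g_K) * v + I_syn) / tau_m
    if v >= v_th:
        v = v_reset
        spikes += 1
        g_K = max(g_K - eta, 0.05)           # use-dependent: spiking lowers g_K

print(f"spikes: {spikes}, adapted g_K: {g_K:.2f}")
```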
Membrane receptors for neuromodulators (NM) are highly regulated in their distribution and efficacy, a phenomenon that influences the individual cell's response to central signals of NM release. Even though NM receptor regulation is implicated in the pharmacological action of many drugs, and is also known to be influenced by various environmental factors, its functional consequences and modes of action are not well understood. In this paper we summarize relevant experimental evidence on NM receptor regulation (specifically dopamine D1 and D2 receptors) in order to explore its significance for neural and synaptic plasticity. We identify the relevant components of NM receptor regulation (receptor phosphorylation, receptor trafficking, and sensitization of second-messenger pathways) as established in studies on cultured cells. Key principles in the regulation and control of short-term plasticity (sensitization) are identified, and a model is presented which employs direct and indirect feedback regulation of receptor efficacy. We also discuss long-term plasticity, which involves shifts in receptor sensitivity and loss of responsivity to NM signals. Finally, we discuss the implications of NM receptor regulation for models of brain plasticity and memorization. We emphasize that a realistic model of brain plasticity will have to go beyond Hebbian models of long-term potentiation and depression. Plasticity in the distribution and efficacy of NM receptors may provide another important source of functional plasticity with implications for learning and memory.
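A schematic sketch of direct and indirect feedback regulation of receptor efficacy is given below (an assumption: a two-variable phenomenological model, not the specific model presented in the paper). Receptor activation drives desensitization of efficacy R (direct feedback), while sustained downstream activity sensitizes second-messenger gain S (indirect feedback); all rate constants are illustrative.

```python
# Phenomenological sketch of receptor-efficacy regulation under a square
# pulse of neuromodulator; R, S, and all rate constants are assumptions.
import numpy as np

dt = 0.01                        # arbitrary time units
R, S = 1.0, 1.0                  # receptor efficacy, second-messenger sensitivity
k_desens, k_recover = 0.8, 0.1   # direct feedback: desensitization / recovery
k_sens, k_decay = 0.05, 0.02     # indirect feedback: sensitization / decay

trace = []
for step in range(3000):
    NM = 1.0 if 500 <= step < 1500 else 0.0     # square pulse of neuromodulator
    response = NM * R * S                        # effective downstream signal
    R += dt * (k_recover * (1.0 - R) - k_desens * NM * R)   # direct feedback
    S += dt * (k_sens * response - k_decay * (S - 1.0))     # indirect feedback
    trace.append(response)

print(f"peak response {max(trace):.2f}, final R {R:.2f}, final S {S:.2f}")
```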
Presynaptic neuromodulatory receptors can suppress synaptic transmission for seconds to minutes when fully engaged, effectively altering the synaptic strength of a connection. Much work on neuromodulation has rested on the assumption that these effects are uniform across neurons. However, there is considerable evidence to suggest that presynaptic regulation may in effect be synapse-specific. This would define a second "weight modulation" matrix, which reflects presynaptic receptor efficacy at a given site. Here we explore the functional consequences of this hypothesis. By analyzing and comparing the weight matrices of networks trained on different aspects of a task, we identify the potential for a low-complexity "modulation matrix" which allows switching between differently trained subtasks while retaining the general performance characteristics of the task. This means that a given network can adapt itself to different task demands by regulating its release of neuromodulators. Specifically, we suggest that (a) a network can provide optimized responses for related classification tasks without the need to train entirely separate networks, and (b) a network can blend a "memory mode", which aims at reproducing memorized patterns, with a "novelty mode", which facilitates classification of new patterns. We relate this work to the known effects of neuromodulators on brain-state-dependent processing.
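As a conceptual sketch of the "weight modulation" matrix idea (an assumption: modulation acts as an element-wise scaling of trained weights, W_eff = M * W, with M a low-rank matrix; the toy network and modes below do not correspond to those in the paper), the same weight matrix can produce different responses under different modulation settings.

```python
# Toy sketch: one trained weight matrix, two low-rank modulation settings
# ("memory" vs. "novelty" modes); all sizes and ranges are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_in, n_out = 8, 4
W = rng.normal(0.0, 1.0, (n_out, n_in))          # weights trained on a base task

# Low-complexity modulation: rank-1 matrices built from pre- and postsynaptic
# receptor efficacies, one setting per mode (subtask).
modes = {
    "memory":  np.outer(rng.uniform(0.8, 1.2, n_out), rng.uniform(0.8, 1.2, n_in)),
    "novelty": np.outer(rng.uniform(0.2, 1.8, n_out), rng.uniform(0.2, 1.8, n_in)),
}

x = rng.normal(0.0, 1.0, n_in)                   # an input pattern
for name, M in modes.items():
    y = np.tanh((M * W) @ x)                     # same network, modulated weights
    print(name, np.round(y, 2))
```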