Neural codes have been postulated to build efficient representations of the external world. The hippocampus, an encoding system, employs neuronal firing rates and spike phases to encode external space. Although the biophysical origin of such codes lies at the level of single neurons, the role of neural components in efficient coding is not understood. The complexity of this problem lies in the dimensionality of the parametric space encompassing neural components, and is amplified by the enormous biological heterogeneity observed in each parameter. A central question that spans encoding systems, therefore, is how neurons arrive at efficient codes in the face of widespread biological heterogeneities. To answer this, we developed a conductance-based spiking model for phase precession, a phase code of external space exhibited by hippocampal place cells. Our model accounted for several experimental observations on place-cell firing and electrophysiology: the emergence of phase precession from exact spike timings of conductance-based models with neuron-specific ion channels and receptors; biological heterogeneities in neural components and excitability; the emergence of a subthreshold voltage ramp, increased firing rate, and enhanced theta power within the place field; a signature reduction in extracellular theta frequency compared to its intracellular counterpart; and experience-dependent asymmetry in the firing-rate profile. We formulated phase-coding efficiency, using Shannon's information theory, as an information-maximization problem with spike phase as the response and external space within a single place field as the stimulus. We employed an unbiased stochastic search spanning an 11-dimensional neural space, involving thousands of iterations that accounted for the biophysical richness and neuron-to-neuron heterogeneities. We found a small subset of models that exhibited efficient spatial information transfer through the phase code, and investigated the distinguishing features of this subpopulation at the parametric and functional scales. At the parametric scale, which spans the molecular components that define the neuron, several nonunique parametric combinations with weak pairwise correlations yielded models with similarly high phase-coding efficiency. Importantly, placing additional constraints on these models in terms of matching other aspects of hippocampal neural responses did not hamper parametric degeneracy. We provide quantitative evidence demonstrating this parametric degeneracy to be a consequence of a many-to-one relationship between the different parameters and phase-coding efficiency. At the functional scale, involving the cellular-scale neural properties, our analyses revealed an important higher-order constraint that was exclusive to models exhibiting efficient phase coding. Specifically, we found a counterbalancing negative correlation between neuronal gain and the strength of external synaptic inputs as a critical functional constraint for the emergence of efficient phase coding. These observations implicate...
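One way to make the information-theoretic formulation concrete, purely as an illustrative sketch, is to estimate the mutual information between binned position within a single place field and the theta phase of each spike. The binning choices, variable names, and any subsequent normalization below are assumptions for illustration, not the paper's exact definitions:

```python
import numpy as np

def phase_position_information(position, phase, n_pos_bins=20, n_phase_bins=18):
    """Estimate I(position; phase) in bits from paired per-spike samples.

    position : 1D array of locations (within one place field) at spike times
    phase    : 1D array of theta phases (radians) of the corresponding spikes
    """
    # Joint histogram over position x phase bins -> joint probability p(x, phi)
    joint, _, _ = np.histogram2d(position, phase, bins=[n_pos_bins, n_phase_bins])
    p_joint = joint / joint.sum()

    # Marginals p(x) and p(phi)
    p_x = p_joint.sum(axis=1, keepdims=True)    # shape (n_pos_bins, 1)
    p_phi = p_joint.sum(axis=0, keepdims=True)  # shape (1, n_phase_bins)

    # I(X; Phi) = sum_{x, phi} p(x, phi) * log2( p(x, phi) / (p(x) p(phi)) )
    nonzero = p_joint > 0
    ratio = p_joint[nonzero] / (p_x @ p_phi)[nonzero]
    return float(np.sum(p_joint[nonzero] * np.log2(ratio)))
```

An efficiency measure could then, for instance, be expressed by normalizing this quantity against the entropy of the position distribution within the field; the actual definition used in the study may differ.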
Hippocampal place cells encode space through phase precession, whereby neuronal spike phase progressively advances during place-field traversals. What neural constraints are essential for achieving efficient transfer of information through such phase codes, while concomitantly maintaining signature neuronal excitability? Here, we developed a conductance-based model for phase precession within the temporal sequence compression framework, and defined phase-coding efficiency using information theory. We recruited an unbiased stochastic search strategy to generate thousands of models, each with distinct intrinsic properties but receiving inputs with identical temporal structure. We found phase precession and associated efficiency to be critically reliant on neuronal intrinsic properties. Despite this, disparate parametric combinations with weak pairwise correlations resulted in models with similar high-efficiency phase codes and similar excitability characteristics. Mechanistically, the emergence of such parametric degeneracy was dependent on two factors. First, the dependence of phase-coding efficiency on individual ion channels was differential and variable across models. Second, phase-coding efficiency manifested weak dependence on either intrinsic excitability or synaptic strength individually, instead emerging through synergistic interactions between synaptic and intrinsic properties. Despite these variable dependencies, our analyses predicted a dominant role for calcium-activated potassium channels in regulating phase precession and coding efficiency. Finally, we demonstrated that a change in afferent statistics, manifesting as input asymmetry, introduced an adaptive shift in the phase code that preserved its efficiency. Our study unveils a critical role for neuronal intrinsic properties in achieving phase-coding efficiency, while postulating degeneracy as a framework to attain the twin goals of efficient encoding and robust homeostasis.
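As a rough illustration of what an unbiased stochastic search over a high-dimensional parametric space entails, the sketch below samples 11 parameters uniformly within bounds, evaluates each sampled model, and retains those crossing an efficiency cutoff. The parameter names, bounds, cutoff, and the user-supplied `evaluate` function (standing in for the full conductance-based simulation and efficiency computation) are all hypothetical placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bounds for 11 model parameters (e.g., maximal ion-channel
# conductances, passive properties, synaptic strength); the parameters and
# ranges used in the study will differ.
param_bounds = {f"p{i}": (0.5, 2.0) for i in range(11)}

def sample_model(bounds, rng):
    """Draw one model by sampling each parameter uniformly within its bounds."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}

def run_search(n_models, evaluate, bounds=param_bounds, efficiency_cutoff=0.8):
    """Unbiased stochastic search: sample, simulate, and retain efficient models.

    `evaluate` simulates a model with the given parameters and returns its
    phase-coding efficiency (and could additionally check excitability
    measures before a model is accepted).
    """
    accepted = []
    for _ in range(n_models):
        params = sample_model(bounds, rng)
        efficiency = evaluate(params)
        if efficiency >= efficiency_cutoff:
            accepted.append((params, efficiency))
    return accepted
```

Analyzing the accepted population, rather than any single best-fit model, is what allows questions about degeneracy and pairwise parametric correlations to be posed at all.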
Disruptions in the normal rhythmic functioning of the heart, termed arrhythmia, often result from qualitative changes in the excitation dynamics of the organ. The transitions between different types of arrhythmia are accompanied by alterations in the spatiotemporal pattern of electrical activity that can be measured by observing the time intervals between successive excitations of different regions of the cardiac tissue. Using biophysically detailed models of cardiac activity, we show that the distributions of these time intervals exhibit a systematic change in their skewness during such dynamical transitions. Further, the leading digits of the normalized intervals appear to fit Benford's law more closely at these transition points. This raises the possibility of using these observations to design a clinical indicator for identifying changes in the nature of arrhythmia. More importantly, our results reveal an intriguing relation between the changing skewness of a distribution and its agreement with Benford's law, both of which have been independently proposed earlier as indicators of regime shift in dynamical systems.
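A minimal sketch of how such indicators might be computed from a set of inter-excitation intervals, assuming skewness is the standard third standardized moment and Benford agreement is quantified as the root-mean-square deviation of observed leading-digit frequencies from log10(1 + 1/d); the actual normalization and agreement metric used in the study may differ:

```python
import numpy as np
from scipy.stats import skew

def leading_digit(x):
    """Leading (most significant) digit of each strictly positive value in x."""
    x = np.asarray(x, dtype=float)
    return np.floor(x / 10 ** np.floor(np.log10(x))).astype(int)

def benford_rmsd(intervals):
    """RMS deviation of observed leading-digit frequencies from Benford's law,
    P(d) = log10(1 + 1/d) for d = 1..9."""
    digits = leading_digit(intervals)
    observed = np.array([(digits == d).mean() for d in range(1, 10)])
    benford = np.log10(1 + 1 / np.arange(1, 10))
    return float(np.sqrt(np.mean((observed - benford) ** 2)))

def interval_indicators(intervals):
    """Skewness and Benford agreement of a set of inter-excitation intervals."""
    intervals = np.asarray(intervals, dtype=float)
    return {"skewness": float(skew(intervals)),
            "benford_rmsd": benford_rmsd(intervals)}
```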