2023
DOI: 10.1113/jp281510

Silences, spikes and bursts: Three‐part knot of the neural code

Zachary Friedenberger, Emerson Harkin, Katalin Tóth et al.

Abstract: When a neuron breaks silence, it can emit action potentials in a number of patterns. Some responses are so sudden and intense that electrophysiologists felt the need to single them out, labelling action potentials emitted at a particularly high frequency with a metonym – bursts. Is there more to bursts than a figure of speech? After all, sudden bouts of high‐frequency firing are expected to occur whenever inputs surge. The burst coding hypothesis advances that the neural code has three syllables: silences, spi…

Cited by 11 publications (4 citation statements) · References: 322 publications

“…[29,30,31,32]. For spiking neural networks, burst-dependent algorithms exploit the target-dependence of short-term plasticity [33,34,35], dendrite-dependent bursting [36,37] and burst-dependent plasticity [13,12,14,15] to perform credit assignment in a way that approximates backpropagation [10,11,38]. These burst-dependent learning methods, referred to as 'Burstprop' algorithms, can in principle allow for multilayer local learning in SNNs.…”
Section: 1.2 (mentioning, confidence: 99%)
“…One promising group of local learning methods for training SNNs are burstdependent or 'Burstprop' learning methods [10,11]. These algorithms are grounded in the widely observed burst-dependence of synaptic plasticity [12,13,14,15] and synaptic dynamics which confers synapses with the ability to communicate spike-timing patterns differently to different targets. In hierarchical networks, Burstprop can approximate the backpropagation of error algorithm with single-phase local learning.…”
Section: Introduction (mentioning, confidence: 99%)
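
As a rough illustration of the burst-dependent learning idea described in the statements above, the sketch below implements a toy, rate-based two-layer network in which the feedforward signal is carried by event rates and a top-down error shifts the burst probability away from a baseline; each weight update is the product of that burst-probability deviation and the presynaptic rate. This is only a minimal sketch of the general principle, not the cited Burstprop implementations; the network sizes, constants, and the use of the transposed forward weights as feedback are illustrative assumptions.

```python
# Toy, rate-based sketch of a Burstprop-style local update (illustrative only).
# Event rates carry the feedforward signal; the deviation of the burst
# probability from its baseline carries an error-like top-down signal that
# gates the local weight update. All names and constants are assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny two-layer network: input -> hidden -> output
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0, 0.5, (n_hid, n_in))
W2 = rng.normal(0, 0.5, (n_out, n_hid))
eta = 0.1          # learning rate
p0 = 0.2           # baseline burst probability

x = rng.random(n_in)           # input "event rates"
target = np.array([1.0, 0.0])  # desired output rates

for step in range(200):
    # Forward pass: event rates propagate through the network.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)

    # A top-down "teaching" signal sets the output burst probability:
    # the output error shifts burst probability around its baseline.
    p_out = np.clip(p0 + (target - y), 0.0, 1.0)

    # Output weights: burst-probability deviation times presynaptic
    # event rate -- a burst-dependent, purely local plasticity rule.
    W2 += eta * np.outer(p_out - p0, h)

    # Hidden burst probability inherits the error through feedback
    # connections (here the transposed forward weights, as a stand-in).
    p_hid = np.clip(p0 + W2.T @ (p_out - p0), 0.0, 1.0)
    W1 += eta * np.outer(p_hid - p0, x)

print("output after training:", sigmoid(W2 @ sigmoid(W1 @ x)))
```
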
“…A general model of the state of neurons or other elements of a single dynamical system receiving external inputs can be written in the form of a multidimensional ordinary differential equation (ODE), ẋ = f(x; w, η(t)) (3.11), with x ∈ ℝ^X a high-dimensional state variable representing the activity of neurons or neuron populations (e.g., membrane potential or firing rate), and where w stands for connectivity parameters (and maybe others). For example, in a model of the human brain, each coordinate may represent the average firing rate of populations of neurons (a neural mass model) or a more detailed ternary code of silences, spikes, and bursts in each cell [75]. This finite-dimensional equation can be generalized to a partial differential equation (a neural field equation), but these important details are not essential in what follows.…”
Section: General Model (mentioning, confidence: 99%)
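
To make the quoted equation (3.11) concrete, the sketch below integrates ẋ = f(x; w, η(t)) with a simple forward-Euler scheme, using a leaky firing-rate choice of f as a stand-in for a neural mass model. The particular f, time constants, and noise term are assumptions for illustration, not the model used in the cited work.

```python
# Minimal sketch: forward-Euler integration of the generic network ODE
#   dx/dt = f(x; w, eta(t))
# with a simple leaky firing-rate model as f. The choice of f, the
# parameters and the noise model are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

N = 5                                         # number of units / populations
w = rng.normal(0, 1.0 / np.sqrt(N), (N, N))   # connectivity parameters
tau = 0.02                                    # time constant (s)
dt = 1e-3                                     # integration step (s)

def f(x, w, eta):
    # Leaky rate dynamics driven by recurrent input and external input eta(t).
    return (-x + np.tanh(w @ x) + eta) / tau

x = np.zeros(N)
trace = []                                    # state history, e.g. for plotting
for t in range(1000):
    eta = 0.1 * rng.normal(size=N)            # external input eta(t)
    x = x + dt * f(x, w, eta)                 # forward-Euler step
    trace.append(x.copy())

print("final state:", np.round(x, 3))
```
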
“…[24] Determining the precise rules governing creation of dendritic plateau potentials is thus a critical step in understanding the basis of memory formation [25,26]. Specifically, one would like to predict whether a given spatial pattern of synaptic inputs and temporal sequence of bAPs will produce a dendritic plateau potential, and how the different dendritic ion channels contribute to this process. To date, the underlying spatial structures of these electrical events have not been visualized, so it is unclear how bAPs and synaptic inputs interact within the dendritic tree [27,28], and how the various ion channels and excitations work together to implement computationally meaningful learning rules.…”
Mentioning, confidence: 99%