2020
DOI: 10.3389/fncom.2020.00033
Perceptron Learning and Classification in a Modeled Cortical Pyramidal Cell

Abstract: The perceptron learning algorithm and its multiple-layer extension, the backpropagation algorithm, are the foundations of the present-day machine learning revolution. However, these algorithms utilize a highly simplified mathematical abstraction of a neuron; it is not clear to what extent real biophysical neurons with morphologically-extended non-linear dendritic trees and conductance-based synapses can realize perceptron-like learning. Here we implemented the perceptron learning algorithm in a realistic biophysical…
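As a point of reference for the abstract, here is a minimal sketch of the classic perceptron abstraction being discussed, written in Python. The task, variable names, and hyperparameters are illustrative assumptions; this is the simplified point-neuron algorithm, not the paper's conductance-based implementation.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Classic perceptron learning rule for binary classification.

    X: (n_samples, n_features) array; y: labels in {0, 1}.
    Guaranteed to converge only when the classes are linearly separable.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(X, y):
            y_hat = 1.0 if w @ x_i + b > 0 else 0.0  # step activation
            if y_hat != y_i:
                w += lr * (y_i - y_hat) * x_i  # w <- w + lr * (y - y_hat) * x
                b += lr * (y_i - y_hat)
                errors += 1
        if errors == 0:  # every example classified correctly
            break
    return w, b

# Usage: logical AND, a linearly separable task, so the rule converges.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])
w, b = train_perceptron(X, y)
```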


Cited by 31 publications (32 citation statements) | References 68 publications
Citation statement classification: 2 supporting, 30 mentioning, 0 contrasting; citing publications span 2020 to 2024.
“…Mushroom-like spines would standardize local postsynaptic potentials throughout the dendritic tree and reduce the location-dependent variability of excitatory responses ( Gulledge et al, 2012 ). Other modeled distal synapses may not impact the cell’s output ( Moldwin and Segev, 2019 ). Ramified spines have additional functional possibilities by displaying postsynaptic receptors on different parts of the spine heads ( Verzi and Noris, 2009 ) with likely temporal and spatial specificity and signaling microdomains ( Newpher and Ehlers, 2009 ; Chen and Sabatini, 2012 ).…”
Section: Dendrites and Spines in Pyramidal Neurons
confidence: 99%
“…2B). In particular, they provide substantially better performance than the perceptron learning rule [Moldwin and Segev, 2020]. The reason is that the latter is guaranteed to work well only if the training examples are linearly separable, in particular if they do not contain contradicting examples that commonly occur in the case of probabilistic predictions.…”
Section: Discussion
confidence: 99%
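To see why contradicting examples void the perceptron convergence guarantee, as the quoted passage notes, consider a minimal hypothetical case in which the same input is presented with both labels. In this sketch the two updates cancel each epoch, so the weights never settle and at least one example is always misclassified:

```python
import numpy as np

# Degenerate "contradicting examples": the same input with both labels.
X = np.array([[1.0], [1.0]])
y = np.array([1.0, 0.0])

w, b, lr = np.zeros(1), 0.0, 0.1
for epoch in range(3):
    for x_i, y_i in zip(X, y):
        y_hat = 1.0 if w @ x_i + b > 0 else 0.0  # step activation
        w += lr * (y_i - y_hat) * x_i
        b += lr * (y_i - y_hat)
    # The two updates cancel exactly each epoch: w and b return to zero,
    # so training error can never reach zero on this data.
    print(epoch, w, b)
```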
“…In order to facilitate visualization, we consider here just two-dimensional firing patterns x, but the same principle applies however for input patterns of arbitrary dimension. These two point clouds are not linearly separable, and hence the perceptron learning rule (as investigated in [Moldwin and Segev, 2020]) cannot do a good job in separating them, as indicated by the red trace in Fig. 2B.…”
Section: A Simple Illustration of the Capability of DLR to Enable Probabilistic Predictions
confidence: 99%
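A generic recreation of the scenario the quote describes (illustrative data only, not the figure's actual point clouds): sample two overlapping 2D Gaussian clusters and observe that the perceptron rule cannot drive the training error to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two overlapping 2D point clouds: not linearly separable.
X = np.vstack([rng.normal([0, 0], 1.0, (200, 2)),
               rng.normal([1, 1], 1.0, (200, 2))])
y = np.array([0.0] * 200 + [1.0] * 200)

w, b, lr = np.zeros(2), 0.0, 0.05
for _ in range(50):  # perceptron rule; no convergence is possible here
    for x_i, y_i in zip(X, y):
        y_hat = 1.0 if w @ x_i + b > 0 else 0.0
        w += lr * (y_i - y_hat) * x_i
        b += lr * (y_i - y_hat)

preds = (X @ w + b > 0).astype(float)
print("training error:", np.mean(preds != y))  # stays well above zero
```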
“…We used the simulated spike trains of the postsynaptic neuron and the population as ground-truth data for deriving connection weights using a perceptron algorithm [51,52]. At each iteration of the algorithm, the derived connection weights were updated using the perceptron learning rule:…”
Section: Deriving Connectivity
confidence: 99%
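The quotation is cut off at the colon, so the exact update expression used in the citing paper is not reproduced above. For reference, the standard perceptron learning rule that such a derivation typically instantiates is (textbook notation, not necessarily the paper's):

$$ w_i \leftarrow w_i + \eta \, (y - \hat{y}) \, x_i $$

where $\eta$ is the learning rate, $x_i$ the $i$-th presynaptic input, $y$ the target output, and $\hat{y}$ the output produced under the current weights.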