2022
DOI: 10.1007/978-3-030-95470-3_28
Evaluating Hebbian Learning in a Semi-supervised Setting

Abstract: We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed using biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explored two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). Our approach was applied in sample efficiency scenarios, where the amount of ava…
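To make the pre-training idea concrete, here is a minimal sketch of a soft-WTA Hebbian weight update for a single fully connected layer. This is an illustrative assumption, not the authors' implementation: the function name, the softmax gating, and the decay-toward-input form of the update are choices made here for clarity.

```python
import numpy as np

def soft_wta_hebbian_update(W, x, lr=0.01, temperature=1.0):
    """One soft-Winner-Takes-All Hebbian update (illustrative sketch).

    W: (n_units, n_inputs) weight matrix; x: (n_inputs,) input sample.
    Each unit's update is gated by a softmax over the unit responses, so
    strongly responding units learn the most ("soft" winner-takes-all).
    """
    y = W @ x                                  # unit activations
    g = np.exp((y - y.max()) / temperature)    # subtract max for stability
    g = g / g.sum()                            # softmax gating over units
    # Hebbian step pulling each winner's weights toward the input;
    # the -W term acts as weight decay and keeps the norms bounded.
    W += lr * g[:, None] * (x[None, :] - W)
    return W

# Usage: one unsupervised update of a random layer on a random input.
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 64))
x = rng.normal(size=64)
W = soft_wta_hebbian_update(W, x)
```

In the semi-supervised setting described by the abstract, updates like this would run over unlabeled data layer by layer, after which the network is fine-tuned end-to-end with backprop on the labeled subset.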

Cited by 6 publications (5 citation statements) · References 39 publications
“…The details of Hebbian learning theories are outside the scope of this paper, although a plethora of Hebbian-based learning algorithms have been proposed. We refer the interested reader to [1,13,24-27]. In particular, in our previous works [24,25], we have highlighted the merits of the nonlinear Hebbian Principal Component Analysis (HPCA).…”
Section: Unsupervised Hebbian Pre-training
confidence: 99%
“…We refer the interested reader to [1,13,24-27]. In particular, in our previous works [24,25], we have highlighted the merits of the nonlinear Hebbian Principal Component Analysis (HPCA). This is derived by minimizing the representation error:…”
Section: Unsupervised Hebbian Pre-training
confidence: 99%
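The excerpt cuts off before the formula. For context, a commonly used form of the representation error minimized by nonlinear Hebbian PCA is sketched below; this is an assumption based on standard formulations, and the exact expression in the cited works may differ:

```latex
L(\mathbf{w}_i) \;=\; \mathbb{E}_{\mathbf{x}}\!\left[\,
  \Big\| \mathbf{x} \;-\; \sum_{j \leq i} f(y_j)\,\mathbf{w}_j \Big\|^2
\,\right],
\qquad y_j = \mathbf{w}_j^{\top}\mathbf{x},
```

where f is a nonlinearity applied to the unit responses. Gradient descent on an objective of this shape yields a nonlinear generalization of Sanger's rule, with each unit reconstructing the residual left by the preceding units.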
“…Therefore, researchers again took inspiration from biology in order to find new learning solutions as alternatives to backprop. The goal was not only to address the problem of SNN training [33,148], but also to discover novel approaches to the learning problem [77,139,182], and possibly more data-efficient strategies [69,90,105-107,110]. In fact, another limitation of current DL solutions is the requirement of Fig.…”
Section: Introduction
confidence: 99%