Proceedings of the 2019 3rd International Conference on Computer Science and Artificial Intelligence
DOI: 10.1145/3374587.3375880
Transfer Learning for Image Classification Using Hebbian Plasticity Principles

Cited by 8 publications (7 citation statements) | References 8 publications
“…In the proposed classifier, transfer learning is applied to the CNN model to improve learning efficiency [37][38][39][40]. In transfer learning, knowledge previously learned while solving one problem is transferred and reused to solve another, related problem.…”
Section: Transfer Learning
confidence: 99%
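The passage above summarizes the standard transfer-learning recipe. As a minimal illustration, and not the cited classifier's actual architecture, the PyTorch sketch below reuses an ImageNet-pre-trained backbone and retrains only a newly attached head; `num_classes`, the learning rate, and the dummy batch are placeholder assumptions.

```python
# Minimal transfer-learning sketch (generic; hyperparameters are assumptions).
import torch
import torch.nn as nn
from torchvision import models

num_classes = 10  # hypothetical size of the new target task

model = models.resnet18(weights="IMAGENET1K_V1")  # knowledge learned on ImageNet
for param in model.parameters():
    param.requires_grad = False                   # freeze the transferred features

model.fc = nn.Linear(model.fc.in_features, num_classes)  # new task-specific head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative update step on a dummy batch.
x = torch.randn(8, 3, 224, 224)
y = torch.randint(0, num_classes, (8,))
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```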
“…We have previously mentioned the idea of combining error-driven learning with Hebbian learning in order to integrate task-specific and general knowledge. This idea was also shown to be effective in the context of transfer learning [128,129] and meta-learning [138], thanks to the transferability of unsupervised Hebbian features. In particular, in [138] the differentiable plasticity model was introduced, where a synapse is assumed to be composed of two parts: a backprop weight and a Hebbian weight.…”
Section: Beyond Backprop Approximations
confidence: 99%
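To make the two-part synapse concrete, here is a rough sketch of a differentiable-plasticity layer. The exact update rule (a decaying running average of post/pre outer products, gated by a learned per-connection gain `alpha` with a learned rate `eta`) is one common formulation and is assumed here rather than quoted from [138]; `PlasticLinear` is a hypothetical name.

```python
# Sketch of a layer whose effective weight is a slow backprop weight w plus
# a fast Hebbian trace gated by a learned gain alpha (formulation assumed,
# not quoted from [138]).
import torch
import torch.nn as nn

class PlasticLinear(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w = nn.Parameter(0.01 * torch.randn(out_dim, in_dim))      # backprop weight
        self.alpha = nn.Parameter(0.01 * torch.randn(out_dim, in_dim))  # plasticity gain
        self.eta = nn.Parameter(torch.tensor(0.1))                      # Hebbian rate

    def forward(self, x, hebb):
        # Effective weight combines the slow and the plastic components.
        y = torch.tanh(x @ (self.w + self.alpha * hebb).t())
        # Hebbian trace: decaying average of post/pre outer products.
        hebb = (1 - self.eta) * hebb + self.eta * torch.einsum("bi,bj->ij", y, x) / x.size(0)
        return y, hebb

layer = PlasticLinear(16, 8)
hebb = torch.zeros(8, 16)   # plastic state; typically reset at episode start
y, hebb = layer(torch.randn(4, 16), hebb)
```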
“…The results suggested that Hebbian learning is suitable for training early feature detectors, as well as higher network layers, but not very effective for training intermediate network layers. Furthermore, Hebbian learning was successfully used to retrain the higher layers of a pre-trained network, achieving results comparable to backprop, but requiring fewer training epochs, thus suggesting potential applications in the context of transfer learning (see also [32,156,157]). Some contributions [132,137] showed promising results of unsupervised Hebbian algorithms for semi-supervised network training, in learning scenarios with scarce data availability, achieving superior results compared to other backprop-based unsupervised methods for semi-supervised training such as Variational Auto-Encoders (VAE) [118].…”
Section: Synaptic Plasticity Models In Deep Learning
confidence: 99%
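As a concrete example of the retraining setup described above, the sketch below replaces backprop in a single top layer with a purely local Hebbian update; Oja's rule is used here for illustration (the cited works use their own Hebbian variants), and the random features stand in for a frozen pre-trained backbone.

```python
# Sketch: retrain one higher layer with a local Hebbian update (Oja's rule,
# chosen for illustration) on features from a frozen pre-trained backbone.
import torch

def oja_step(W, x, lr=1e-3):
    """Per-neuron Oja update: dw = lr * y * (x - y * w), which bounds the weights."""
    y = x @ W.t()                                             # (batch, out)
    dW = torch.einsum("bi,bj->ij", y, x) - (y * y).sum(0).unsqueeze(1) * W
    return W + lr * dW / x.size(0)

W = 0.01 * torch.randn(32, 128)   # hypothetical new top layer: 128 -> 32 units
for _ in range(100):
    feats = torch.randn(64, 128)  # stand-in for frozen backbone features
    W = oja_step(W, feats)
```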
“…The results suggested that Hebbian learning is suitable for training early feature detectors, as well as higher network layers, but not very effective for training intermediate network layers. Furthermore, Hebbian learning was successfully used to retrain the higher layers of a pre-trained network, achieving results comparable to backprop, but requiring fewer training epochs, thus suggesting potential applications in the context of transfer learning (see also [28,29,30]).…”
Section: Hebbian Learning
confidence: 99%