2022
DOI: 10.1371/journal.pcbi.1010628

Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation

Abstract: Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking network to study mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when tra…
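
The mechanism sketched in the abstract is an alternation between supervised task training and sleep-like phases in which spontaneous activity and local unsupervised plasticity reshape the same shared weights. Below is a minimal, assumption-heavy sketch of that alternation only: a toy rate-based network with a delta-rule "awake" phase and a Hebbian "sleep" phase. It is not the paper's spiking model, plasticity rules, or foraging task; all functions and parameter values are illustrative.

```python
# Toy sketch: alternate supervised task training with an unsupervised
# "sleep" phase driven by spontaneous activity (a stand-in for replay).
# This is NOT the paper's spiking model; it is a rate-based analogue.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 20, 5
W = 0.1 * rng.standard_normal((n_out, n_in))   # one weight matrix shared by all tasks

def make_task(seed):
    """A toy task: map random input patterns to fixed random targets."""
    r = np.random.default_rng(seed)
    X = r.standard_normal((100, n_in))
    T = r.standard_normal((100, n_out))
    return X, T

def train_awake(W, X, T, lr=1e-2, epochs=20):
    """Supervised (delta-rule) training on one task."""
    for _ in range(epochs):
        Y = X @ W.T
        W += lr * (T - Y).T @ X / len(X)
    return W

def sleep(W, steps=200, lr=1e-3, decay=1e-3):
    """Unsupervised phase: spontaneous noise drives Hebbian updates plus a
    mild decay, loosely mimicking replay-based consolidation."""
    for _ in range(steps):
        x = rng.standard_normal(n_in)           # spontaneous activity
        y = W @ x                                # network's own response ("replay")
        W += lr * np.outer(y, x) - decay * W     # Hebbian term + homeostatic decay
    return W

tasks = [make_task(1), make_task(2)]
for X, T in tasks:                # sequential training, sleep after each task
    W = train_awake(W, X, T)
    W = sleep(W)

for i, (X, T) in enumerate(tasks):
    print(f"task {i} MSE:", np.mean((X @ W.T - T) ** 2))
```

Printing the per-task error after sequential training gives a quick check of how much the earlier task was overwritten under this toy setup.
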


Cited by 11 publications (6 citation statements) · References 73 publications

Citation statements (ordered by relevance):

“…Several methods have been proposed to overcome these limitations. These include data augmentation (Shorten and Khoshgoftaar 2019), pretraining on other datasets (Zhuang et al 2020) or alternative architectures such as neural tangent kernel (Arora et al 2019). However, these approaches do not address the fundamental question of how to make overparameterized deep learning networks learn to generalize from small datasets without overfitting.…”
Section: Introduction (mentioning, confidence: 99%)

“…Two critical components which are believed to underlie memory consolidation during sleep are spontaneous replay of memory traces and local unsupervised synaptic plasticity that restricts synaptic changes to relevant memories only. During sleep, replay of recently learned memories along with relevant old memories enables the network to form stable long-term memory representations (Rasch and Born 2013) and reduces competition between memories (González et al 2020; Golden et al 2022). The idea of replay has been explored in machine learning to enable continual learning.…”
Section: Introduction (mentioning, confidence: 99%)
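
A standard concrete form of the "local unsupervised synaptic plasticity" mentioned in this statement is pairwise spike-timing-dependent plasticity (STDP), where the sign and size of the weight change depend on the pre/post spike-time difference. The sketch below shows the usual exponential STDP window with illustrative constants; it is a generic textbook rule, not the specific plasticity used in the cited models.

```python
# Hedged sketch of a pairwise STDP rule; parameter values are illustrative.
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair.
    dt = t_post - t_pre (ms): positive dt (pre before post) potentiates,
    negative dt (post before pre) depresses."""
    if dt >= 0:
        return a_plus * np.exp(-dt / tau_plus)
    return -a_minus * np.exp(dt / tau_minus)

# Example: LTP for a pre->post pairing at +10 ms, LTD at -10 ms
print(stdp_dw(10.0), stdp_dw(-10.0))
```
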
“…While many methods seek to prevent catastrophic forgetting [33], such as Elastic Weight Consolidation (EWC) [34], algorithms such as LIME [35], and even replay during sleep [36], it is still argued that catastrophic forgetting has not been solved [37]. If catastrophic forgetting is due to a lack of modularization of information, it becomes crucial to accurately measure this modularization to identify learning schemes that promote modules.…”
Section: Introduction (mentioning, confidence: 99%)
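
As background on one of the methods named in this statement, the sketch below shows the standard Elastic Weight Consolidation penalty (Kirkpatrick et al.): the new-task loss plus a quadratic term that anchors each parameter to its old-task value in proportion to an estimate of its Fisher information. The toy loss and Fisher values are invented for illustration; this is not code from the cited works.

```python
# Hedged sketch of the EWC objective:
#   L(theta) = L_new(theta) + (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2
import numpy as np

def ewc_loss(theta, task_loss, theta_old, fisher, lam=1.0):
    """New-task loss plus the Fisher-weighted quadratic anchoring penalty."""
    penalty = 0.5 * lam * np.sum(fisher * (theta - theta_old) ** 2)
    return task_loss(theta) + penalty

# Toy usage with a quadratic new-task loss and made-up Fisher estimates
theta_old = np.array([1.0, -0.5])      # parameters after the old task
fisher = np.array([5.0, 0.1])          # high F_i -> parameter is protected
new_task = lambda th: np.sum((th - np.array([0.0, 2.0])) ** 2)
print(ewc_loss(np.array([0.2, 1.5]), new_task, theta_old, fisher, lam=10.0))
```
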
“…For this reason, several works propose approaches for mitigating catastrophic forgetting in this type of network. Most of the existing methods are based on parameter regularization [7,8,9,10] and on the optimization process [11], while [12] uses off-line reactivation to consolidate memories from previous tasks. To the best of the author's knowledge, there is no prior work using memory replay to tackle catastrophic forgetting.…”
Section: Introduction (mentioning, confidence: 99%)
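
For contrast with the regularization-based methods this statement mentions, here is a generic sketch of rehearsal-style memory replay: a bounded buffer keeps a sample of past-task examples, and a few of them are mixed into every new-task batch before an update. The class, capacity, and batch sizes are hypothetical illustrations, not any cited work's implementation.

```python
# Hedged sketch of rehearsal-based replay for continual learning.
import random

class ReplayBuffer:
    """Bounded buffer keeping a roughly uniform (reservoir) sample of examples."""
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.items = []
        self.n_seen = 0

    def add(self, example):
        self.n_seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.items[j] = example   # replace with probability capacity / n_seen

    def sample(self, k):
        return random.sample(self.items, min(k, len(self.items)))

# Usage: mix replayed old-task examples into each new-task batch
buffer = ReplayBuffer()
for x in range(50):                        # stand-in for task-A data
    buffer.add(("task_A", x))
new_batch = [("task_B", x) for x in range(8)]
mixed_batch = new_batch + buffer.sample(8)  # train on this mixed batch
print(len(mixed_batch))
```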