2019
DOI: 10.1109/jxcdc.2019.2911135

Unsupervised Learning to Overcome Catastrophic Forgetting in Neural Networks

Abstract: Continual learning is the ability to acquire a new task or knowledge without losing any previously collected information. Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, where training on a new task erases all previously learned tasks. Here, we present a new concept of a neural network capable of combining supervised convolutional learning with bio-inspired unsupervised learning. Brain-inspired concepts such as spike-timing-dependent plasticity (STDP) …
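The abstract refers to a bio-inspired unsupervised rule such as STDP. As a rough illustration only (not the paper's implementation, whose details are truncated above), a pair-based STDP weight update can be sketched as follows; the amplitudes and time constant are assumed values.

```python
# Minimal, illustrative pair-based STDP rule (not the paper's implementation,
# whose details are truncated in the abstract). Amplitudes and time constant
# are assumed values.
import numpy as np

A_PLUS, A_MINUS = 0.01, 0.012  # assumed potentiation / depression amplitudes
TAU_MS = 20.0                  # assumed plasticity time constant (ms)

def stdp_delta_w(dt_ms: np.ndarray) -> np.ndarray:
    """Weight change vs. spike-timing difference dt = t_post - t_pre:
    potentiation when the presynaptic spike precedes the postsynaptic one,
    depression otherwise."""
    return np.where(
        dt_ms >= 0,
        A_PLUS * np.exp(-dt_ms / TAU_MS),
        -A_MINUS * np.exp(dt_ms / TAU_MS),
    )

print(stdp_delta_w(np.array([5.0, -5.0])))  # [potentiation, depression]
```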

Cited by 35 publications (47 citation statements); references 36 publications.

Citation statements (ordered by relevance):
“…with a memristive-based network with PCM synapses, in terms of area, energy consumption and testing efficiency (Munoz-Martin et al, 2019). Figure 10 shows the classification results of the continual learning accuracy for every combination of two non-trained classes of the MNIST (a) and the Fashion-MNIST (c) datasets.…”
Section: Results (mentioning)
confidence: 99%
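The protocol quoted above evaluates continual-learning accuracy for every combination of two classes held out of the initial training. The sketch below shows how such splits can be enumerated; the helper name and the use of itertools are illustrative assumptions, not the cited implementation.

```python
# Sketch of the evaluation splits described above (illustrative, not the cited code):
# for each of the C(10, 2) = 45 pairs of held-out classes, the network is first
# trained on the remaining 8 classes and then has to learn the pair continually.
from itertools import combinations

ALL_CLASSES = range(10)  # MNIST and Fashion-MNIST both have 10 classes

def class_splits():
    """Yield (trained_classes, non_trained_pair) for every 2-class hold-out."""
    for pair in combinations(ALL_CLASSES, 2):
        trained = [c for c in ALL_CLASSES if c not in pair]
        yield trained, pair

for trained, pair in class_splits():
    # e.g. pretrain on `trained`, then evaluate continual learning on `pair`
    pass
```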
“…However, the FPGA-based fully digital approach is not the only feasible way to perform continual learning. In particular, other works have described the possibility of implementing a hybrid supervised-unsupervised neural network using a PCM-based approach (Bianchi et al, 2019; Munoz-Martin et al, 2019). PCM devices are among the best candidates for building efficient synaptic elements, especially for their 3D stacking integration and multilevel programming capability (Kuzum et al, 2013).…”
Section: Discussion and Comparison With Memristive-Based Approaches (mentioning)
confidence: 99%
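The "multilevel programming capability" mentioned in the quote means each PCM synapse can store one of a limited set of conductance levels. The sketch below shows one way trained weights could be mapped onto such levels; the uniform quantization scheme and level count are assumptions for illustration, not the cited hardware's programming scheme.

```python
# Illustration (assumed scheme, not the cited hardware): uniform mapping of trained
# weights onto a small number of PCM conductance levels, which is what "multilevel
# programming capability" enables.
import numpy as np

def quantize_to_levels(weights: np.ndarray, n_levels: int = 8) -> np.ndarray:
    """Snap each weight to the nearest of n_levels equally spaced levels."""
    levels = np.linspace(weights.min(), weights.max(), n_levels)
    nearest = np.abs(weights[..., None] - levels).argmin(axis=-1)
    return levels[nearest]

w = np.random.randn(3, 4)
print(quantize_to_levels(w, n_levels=4))
```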
“…Importantly, note that the weight change in the 1T1R synapse can be induced only via spike overlap, hence only for delays in the range −10 ms < Δt < 10 ms in this experiment [152]. Although the STDP characteristics achieved in the 1T1R RRAM synapse [151,152] display a squared shape due to the binary operation of the RRAM cell, instead of the exponentially decaying behavior observed in biological experiments, the plasticity of the 1T1R synapse was exploited in many SNN implementations enabling neuromorphic tasks, such as unsupervised learning of spatial/spatiotemporal patterns [151,152,154,155], the extraction of auditory/visual patterns [156,157], pattern classification [158–160], and associative memory [161–163], in both simulation and hardware. Figure 16a shows the schematic representation of the RRAM-based SNN used in ref.…”
Section: SNNs With Memristive Synapses (mentioning)
confidence: 99%
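To make the contrast in the quote concrete, the sketch below compares an exponentially decaying STDP curve with the squared (rectangular) window produced by binary RRAM switching within the ±10 ms spike-overlap range. The unit amplitudes and 20 ms time constant are assumptions; this is an illustration, not the measured device characteristics.

```python
# Illustration only: exponential (biological-like) vs rectangular (binary 1T1R RRAM)
# STDP windows. The +/-10 ms overlap window follows the quoted experiment; the
# unit amplitudes and 20 ms time constant are assumptions.
import numpy as np

TAU_MS = 20.0      # assumed biological plasticity time constant
WINDOW_MS = 10.0   # spike-overlap window of the 1T1R experiment

def stdp_exponential(dt_ms: np.ndarray) -> np.ndarray:
    """Exponentially decaying weight change vs. dt = t_post - t_pre."""
    return np.sign(dt_ms) * np.exp(-np.abs(dt_ms) / TAU_MS)

def stdp_rectangular(dt_ms: np.ndarray) -> np.ndarray:
    """Fixed potentiation/depression only when the two spikes overlap."""
    return np.where(np.abs(dt_ms) < WINDOW_MS, np.sign(dt_ms), 0.0)

dt = np.linspace(-50.0, 50.0, 11)
print(np.round(stdp_exponential(dt), 3))
print(stdp_rectangular(dt))
```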