2021
DOI: 10.1109/tnnls.2020.3007548

Convolutional Neural Network With Developmental Memory for Continual Learning

Cited by 15 publications (5 citation statements) | References 20 publications
“…CL has already been successfully applied in a domain of interest for our study: image classification. For instance, [17] explore the use of developmental memories to dampen forgetting, and [18] apply CL using transfer learning and k-nearest neighbors. While applications of CL to pattern recognition on accelerometer sensors are still fairly new, this field has so far been explored only in some standard TinyML applications [19].…”
Section: Related Work
confidence: 99%
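The transfer-learning-plus-k-NN recipe the statement attributes to [18] can be illustrated with a minimal sketch: a frozen feature extractor embeds each incoming batch, and a k-nearest-neighbor classifier over the accumulated embeddings absorbs new classes without retraining. Everything below (the extractor, data, and class counts) is a placeholder assumption, not code from [18].

```python
# Hypothetical sketch: continual learning via a frozen extractor + k-NN.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def extract_features(x):
    # Stand-in for a frozen pretrained CNN backbone (transfer learning):
    # any fixed embedding works; here, a fixed linear projection + tanh.
    W = np.linspace(-1, 1, x.shape[1] * 16).reshape(x.shape[1], 16)
    return np.tanh(x @ W)

# Task 1 arrives: store embedded exemplars instead of retraining the backbone.
X1, y1 = rng.normal(size=(100, 32)), rng.integers(0, 2, 100)
memory_X, memory_y = extract_features(X1), y1

# Task 2 arrives later: append its exemplars; old ones are never overwritten,
# so the k-NN classifier does not catastrophically forget task 1.
X2, y2 = rng.normal(loc=3.0, size=(100, 32)), rng.integers(2, 4, 100)
memory_X = np.vstack([memory_X, extract_features(X2)])
memory_y = np.concatenate([memory_y, y2])

knn = KNeighborsClassifier(n_neighbors=5).fit(memory_X, memory_y)
print(knn.predict(extract_features(X2[:3])))
```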
“…Using a Bayesian NN, CBLN [17] preserves distinctive parameters for different datasets to retain performance. Similarly, [4] introduced developmental memory (DM) into a CNN, continually growing sub-memory networks to preserve important features of learned tasks while allowing faster learning. Each sub-memory can store task-specific knowledge through a memory loss function and preserve it during continual adaptation.…”
Section: Related Work
confidence: 99%
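A minimal sketch of the sub-memory idea described above, assuming a shared backbone that grows one task head at a time and a distillation-style memory loss that keeps earlier heads' responses stable; the layer sizes and loss weighting are illustrative assumptions, not the architecture of [4].

```python
# Hedged sketch (not the paper's exact code): a backbone CNN that grows one
# sub-memory head per task; a memory loss keeps old heads' outputs stable.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DMNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten())
        self.heads = nn.ModuleList()  # one sub-memory head per learned task

    def add_task(self, num_classes):
        self.heads.append(nn.Linear(16 * 16, num_classes))

    def forward(self, x, task_id):
        return self.heads[task_id](self.backbone(x))

def memory_loss(model, x, old_outputs):
    # Penalize drift of previously learned heads' responses (distillation-style).
    return sum(F.mse_loss(model(x, t), out_t)
               for t, out_t in enumerate(old_outputs))

model = DMNet()
model.add_task(10)                      # task 0
x = torch.randn(8, 1, 28, 28)
with torch.no_grad():
    snapshot = [model(x, 0)]            # task-0 responses to preserve
model.add_task(5)                       # task 1 arrives
loss = F.cross_entropy(model(x, 1), torch.randint(0, 5, (8,)))
loss = loss + 0.5 * memory_loss(model, x, snapshot)  # weight is an assumption
loss.backward()
```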
“…Methods to alleviate catastrophic forgetting fall into several categories, such as memory-based methods [3], [4], [5], [6], which preserve activations of learned data, and penalization methods, which penalize changes to learned knowledge through regularization [7] or modularization [8]. Among these, the modularity-based parameter-isolation approach is particularly appealing: it is immune to catastrophic forgetting because parameters are frozen after a task has been learned.…”
Section: Introduction
confidence: 99%
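The freezing mechanism that makes parameter isolation immune to forgetting is simple to state in code. A minimal sketch, assuming a PyTorch model split into a shared backbone and per-task heads; the module shapes are placeholders.

```python
# Hedged sketch of the parameter-isolation idea described above: after a task
# is learned, its parameters are frozen, so later training cannot erase them.
import torch
import torch.nn as nn

def freeze_task_parameters(module: nn.Module):
    # Freezing makes the module immune to catastrophic forgetting:
    # gradient updates from subsequent tasks no longer touch it.
    for p in module.parameters():
        p.requires_grad = False

shared_backbone = nn.Linear(128, 64)
task0_head = nn.Linear(64, 10)
# ... train shared_backbone + task0_head on task 0 ...
freeze_task_parameters(shared_backbone)
freeze_task_parameters(task0_head)

task1_head = nn.Linear(64, 5)  # a fresh module is allocated for task 1
trainable = [p for p in task1_head.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.1)  # only task 1's head updates
```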
“…A rotated version of the training dataset is now assumed to be the new batch of data that arrives over time. Figure 5 shows the training data in the top row and the test data in the bottom row. Since the first CNN has a data-augmentation module, the test data would not pose any specific challenge, as the network has rotation-invariant capabilities.…”
Section: Experiments for Dataset Distance
confidence: 99%
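A minimal sketch of such a data-augmentation module, assuming torchvision transforms: random rotations at training time are what give the first CNN its rotation-invariant behavior, so a rotated test set poses no specific challenge. The degree range is an illustrative assumption.

```python
# Hedged sketch of a rotation data-augmentation module, as described above.
from torchvision import transforms

train_tf = transforms.Compose([
    transforms.RandomRotation(degrees=180),  # expose the CNN to all rotations
    transforms.ToTensor(),
])
test_tf = transforms.ToTensor()  # test images arrive already rotated
```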