2022
DOI: 10.1007/978-3-031-20050-2_31

Continual Variational Autoencoder Learning via Online Cooperative Memorization

Abstract: Due to their inference, data representation and reconstruction properties, Variational Autoencoders (VAE) have been successfully used in continual learning classification tasks. However, their ability to generate images with specifications corresponding to the classes and databases learned during Continual Learning (CL) is not well understood, and catastrophic forgetting remains a significant challenge. In this paper, we first analyze the forgetting behaviour of VAEs by developing a new theoretical framework …

Cited by 9 publications (16 citation statements). References 50 publications.

“…However, most existing DEMs require knowing the task identity to provide the auxiliary information for the expansion strategy (Ye and Bors 2021). Recently, DEMs have been shown to achieve promising results in TFCL (Ye and Bors 2022a). The first study to use DEMs for TFCL (Rao et al. 2019) introduced a continual learning framework called Continual Unsupervised Representation Learning (CURL).…”
Section: Related Work (mentioning; confidence: 99%)

“…A similar idea, called the Continual Neural Dirichlet Process Mixture (CN-DPM), uses Dirichlet processes for the VAE component expansion (Lee et al. 2020). Moreover, DEMs can further improve their performance by using an efficient sample selection approach called Online Cooperative Memorization (OCM) (Ye and Bors 2022a), which employs a dual memory system to store both short- and long-term information.…”
Section: Related Work (mentioning; confidence: 99%)
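
To make the dual-memory idea above concrete, here is a minimal Python sketch, not the authors' OCM implementation: the nearest-neighbour novelty test, the buffer capacities, and the random eviction policy are all illustrative assumptions.

```python
import random

import numpy as np


class DualMemory:
    """Minimal sketch of a short-/long-term dual memory for task-free
    continual learning. The novelty test (distance to the nearest stored
    sample) is a placeholder assumption, not the OCM criterion."""

    def __init__(self, short_capacity=64, long_capacity=512, novelty_threshold=1.0):
        self.short = []                      # recent raw samples (FIFO)
        self.long = []                       # diverse samples kept for rehearsal
        self.short_capacity = short_capacity
        self.long_capacity = long_capacity
        self.novelty_threshold = novelty_threshold

    def _novelty(self, x):
        # Distance to the nearest long-term sample; a large value means novel.
        if not self.long:
            return float("inf")
        return min(np.linalg.norm(x - m) for m in self.long)

    def observe(self, x):
        # Every incoming sample first enters the short-term buffer.
        self.short.append(x)
        if len(self.short) > self.short_capacity:
            oldest = self.short.pop(0)
            # Promote to long-term memory only if it still looks novel.
            if self._novelty(oldest) > self.novelty_threshold:
                if len(self.long) >= self.long_capacity:
                    # Reservoir-style random eviction keeps the memory bounded.
                    self.long.pop(random.randrange(len(self.long)))
                self.long.append(oldest)

    def replay_batch(self, batch_size=32):
        # Mix short- and long-term samples for rehearsal at each gradient step.
        pool = self.short + self.long
        return random.sample(pool, min(batch_size, len(pool)))
```

In a training loop, `observe` would be called on every incoming sample and `replay_batch` mixed into each gradient step so the model rehearses both recent and older data.
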
“…Regularization and knowledge distillation methods have recently been considered to further improve performance (Aljundi et al. 2019b; Chaudhry et al. 2019a,b; Bang et al. 2021, 2022; Cha, Lee, and Shin 2021; Kirkpatrick et al. 2017; Kurle et al. 2020; Li and Hoiem 2017). Although the memory-based approaches perform well, they suffer from negative backward transfer when learning new samples (Ye and Bors 2022a). This drawback is addressed by employing a dynamic network architecture, which preserves prior knowledge in the frozen parameters of specific units while building new hidden layers and units to learn novel tasks (Wen, Tran, and Ba 2020; Ye and Bors 2023a,b).…”
Section: Related Work (mentioning; confidence: 99%)
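
The regularization route cited above can be illustrated with an Elastic Weight Consolidation style penalty (Kirkpatrick et al. 2017). The sketch below is a minimal illustration, assuming `fisher` and `old_params` are dictionaries of tensors captured after training on the previous task; the weight `lam` is an arbitrary illustrative value.

```python
import torch


def ewc_penalty(model, fisher, old_params, lam=100.0):
    """EWC-style quadratic penalty: parameters that were important for
    earlier tasks (high Fisher information) are anchored to their old
    values, discouraging destructive updates on new data."""
    penalty = torch.zeros(())
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty


# Usage inside a training step (hypothetical names):
#   loss = task_loss(model, batch) + ewc_penalty(model, fisher, old_params)
#   loss.backward()
```
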
“…However, these methods perform worse when applied to long-term data streams due to their fixed model capacity. Meanwhile, dynamic architecture models have achieved promising results (Lee et al. 2020; Rao et al. 2019; Ye and Bors 2022a). The first dynamic expansion model for TFCL, proposed in (Rao et al. 2019), dynamically adds new inference models to capture data changes while using Generative Replay Mechanisms (GRMs) to alleviate forgetting.…”
Section: Related Work (mentioning; confidence: 99%)
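
As an illustration of this expand-and-replay pattern, the following PyTorch sketch spawns a new VAE component when the active one explains incoming data poorly and rehearses samples generated by the frozen components. The loss-threshold trigger and the `elbo_loss(batch)` / `sample(n)` interface are assumptions for illustration, not the exact criterion of Rao et al. (2019) or of the paper above.

```python
import torch


class ExpandingVAEEnsemble:
    """Sketch of a dynamic expansion model with a Generative Replay
    Mechanism (GRM) for task-free continual learning. Only the newest
    component is trainable; older ones are frozen and used for replay."""

    def __init__(self, make_vae, expansion_threshold=150.0, replay_size=32, lr=1e-3):
        self.make_vae = make_vae                  # factory returning a fresh VAE
        self.components = [make_vae()]
        self.opt = torch.optim.Adam(self.components[-1].parameters(), lr=lr)
        self.threshold = expansion_threshold
        self.replay_size = replay_size
        self.lr = lr

    def train_step(self, batch):
        active = self.components[-1]
        with torch.no_grad():
            # A high negative ELBO on recent data signals a distribution shift.
            shifted = active.elbo_loss(batch).item() > self.threshold
        if shifted:
            for p in active.parameters():         # freeze the saturated component
                p.requires_grad_(False)
            active = self.make_vae()              # and expand with a fresh one
            self.components.append(active)
            self.opt = torch.optim.Adam(active.parameters(), lr=self.lr)
        with torch.no_grad():
            # Generative replay: frozen components synthesise past-like data.
            replay = [c.sample(self.replay_size) for c in self.components[:-1]]
        data = torch.cat([batch, *replay]) if replay else batch
        self.opt.zero_grad()
        loss = active.elbo_loss(data)
        loss.backward()
        self.opt.step()
        return loss.item()
```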