2020
DOI: 10.48550/arxiv.2011.14960
Preprint
BinPlay: A Binary Latent Autoencoder for Generative Replay Continual Learning

Abstract: We introduce a binary latent space autoencoder architecture to rehearse training samples for the continual learning of neural networks. The ability to extend the knowledge of a model with new data without forgetting previously learned samples is a fundamental requirement in continual learning. Existing solutions address it by either replaying past data from memory, which is unsustainable with growing training data, or by reconstructing past samples with generative models that are trained to generalize beyond t…
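The core idea of a binary latent autoencoder — projecting an input into a fixed-length code of hard 0/1 bits rather than real-valued activations — can be illustrated with a minimal NumPy sketch. The dimensions (a flattened 28x28 input, a 200-bit code) follow the figures reported by a citing paper below; the randomly initialized linear maps are illustrative stand-ins for BinPlay's trained convolutional encoder and decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: a flattened 28x28 image and a 200-bit latent code.
INPUT_DIM, CODE_BITS = 784, 200

# Randomly initialized linear maps stand in for trained networks.
W_enc = rng.normal(0, 0.05, (CODE_BITS, INPUT_DIM))
W_dec = rng.normal(0, 0.05, (INPUT_DIM, CODE_BITS))

def encode(x):
    """Map an input to a binary latent code by hard-thresholding."""
    logits = W_enc @ x
    return (logits > 0).astype(np.uint8)  # each latent unit becomes one bit

def decode(b):
    """Reconstruct (approximately) from the binary code."""
    return W_dec @ b.astype(np.float64)

x = rng.random(INPUT_DIM)
code = encode(x)
x_hat = decode(code)

print(code.size, code.size // 8)  # 200 bits, i.e. 25 bytes per stored sample
```

Storing 25 bytes per rehearsed sample, instead of the sample itself, is what makes replaying a growing history of past data tractable in memory.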

Cited by 2 publications (4 citation statements)
References 21 publications
“…In this section, the proposed HyBNN and FedHyBNN are tested on the MNIST dataset. For the experiment, the VAE of HyBNN follows the same architecture as BinPlay (Deja et al. [2020]): the encoder consists of three convolutional layers with batch normalization, followed by an output layer of 200 neurons. The binary latent space consists of merely 200 bits, or 25 bytes.…”
Section: Results
confidence: 99%
“…It has been successfully applied on images (Carreira-Perpinán and Raziperchikolaei [2015]) and on videos (Song et al. [2018]) to produce state-of-the-art results. Recently, binary autoencoders have been used for continual learning: specifically, Deja et al. [2020] used them to create unique sequential binary codes for each input sample, similar to hashing, and then used the older hashes for generative replay to address the problem of catastrophic forgetting.…”
Section: Related Work
confidence: 99%
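The hashing-like scheme described in the statement above — remembering each past sample as a fixed-length binary code tied to its sequential index, and decoding stored codes at replay time — can be sketched as follows. The 200-bit length and the plain little-endian binary expansion are illustrative assumptions; BinPlay's actual codes are learned jointly with the autoencoder.

```python
import numpy as np

def index_to_code(i: int, n_bits: int = 200) -> np.ndarray:
    """Map a sample's sequential index to a fixed-length binary code.

    A plain little-endian binary expansion is an illustrative stand-in
    for BinPlay's learned per-sample codes.
    """
    bits = [(i >> k) & 1 for k in range(n_bits)]
    return np.array(bits, dtype=np.uint8)

# Each past sample is remembered as a 200-bit code instead of raw pixels;
# a trained decoder would map these codes back to images for replay.
codes = [index_to_code(i) for i in range(5)]
print(codes[3][:4])  # little-endian bits of index 3 -> [1 1 0 0]
```

Because the code for a given index is deterministic, the replay buffer never has to store the samples themselves, only enough information to regenerate their codes.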
“…However, we condition our generative model on task number. For that purpose, we use binary encoding with co-prime numbers as proposed in [6].…”
Section: Evaluation Setup
confidence: 99%