2018
DOI: 10.48550/arxiv.1810.10999
Preprint

Reversible Recurrent Neural Networks

Matthew MacKay,
Paul Vicol,
Jimmy Ba
et al.

Abstract: Recurrent neural networks (RNNs) provide state-of-the-art performance in processing sequential data but are memory intensive to train, limiting the flexibility of the RNN models that can be trained. Reversible RNNs, i.e., RNNs for which the hidden-to-hidden transition can be reversed, offer a path to reduce the memory requirements of training, as hidden states need not be stored and can instead be recomputed during backpropagation. We first show that perfectly reversible RNNs, which require no storage of the hidden activ…
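The mechanism the abstract describes, a hidden-to-hidden transition that can be exactly inverted so that states are recomputed rather than stored, can be illustrated with a minimal additive-coupling sketch. This is a simplified toy update, not the paper's actual reversible GRU/LSTM variants; the function names and the tanh parameterization here are illustrative assumptions.

```python
import numpy as np

def rev_step(h1, h2, x, W1, W2):
    """Forward transition on a hidden state split into halves (h1, h2).

    Each half is updated additively from the other half and the input,
    which is what makes the step exactly invertible.
    """
    h1_new = h1 + np.tanh(W1 @ np.concatenate([h2, x]))
    h2_new = h2 + np.tanh(W2 @ np.concatenate([h1_new, x]))
    return h1_new, h2_new

def rev_step_inverse(h1_new, h2_new, x, W1, W2):
    """Exact inverse: subtract the same updates in reverse order.

    During backpropagation, previous hidden states can be recomputed
    from the current ones this way instead of being stored.
    """
    h2 = h2_new - np.tanh(W2 @ np.concatenate([h1_new, x]))
    h1 = h1_new - np.tanh(W1 @ np.concatenate([h2, x]))
    return h1, h2
```

Because the inverse is exact (up to floating-point error), a round trip `rev_step_inverse(*rev_step(h1, h2, x, W1, W2), x, W1, W2)` recovers the original `(h1, h2)`.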

Cited by 1 publication (1 citation statement) | References 26 publications
“…Reversible CNNs have been applied to several traditional image tasks such as compression [46], reconstruction [43], retrieval [42], and denoising [33,47], as well as to compressed sensing [61], compact resolution [75], image-to-image translation [67], remote sensing [56], medical image segmentation [55,74], and MRI reconstruction [57]. Reversible transformations have also been adapted to other networks such as RNNs [51], U-Net [4,16], Masked Convolutional Networks [60], and 1000-layer deep Graph Neural Networks [40]. Some early attempts have also been made to adapt reversible transformations to the NLP domain, initiated by Kitaev et al. [38] and built upon in [78,79] for machine translation.…”
Section: Related Work
confidence: 99%