2015
DOI: 10.48550/arxiv.1511.06085
Preprint
Variable Rate Image Compression with Recurrent Neural Networks

Cited by 109 publications (185 citation statements)
References 0 publications
“…Fig. 4 (b) and (c) show that the SICS is superior or competitive to traditional methods in terms of reconstruction performance, consistent with the findings of most DNN-based image compression methods [10], [11], [12]. In essence, it is the better quality of the reconstructed images that leads to better classification performance.…”
Section: B Comparison Experiments (supporting)
confidence: 76%
“…So dz_{kij}/de_{kij} = 1. Toderici et al. [12] used a stochastic binarization function with z_{kij} = −1 when e_{kij} < 0, and z_{kij} = 1 otherwise. Here we point out that whether one uses a proxy function or adds uniform noise, the derivative of the corresponding rounding function is taken to be 1 in the back propagation.…”
Section: A System Architecture (mentioning)
confidence: 99%
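The quoted passage describes the binarizer attributed to Toderici et al. [12] together with its straight-through gradient. A minimal NumPy sketch of that idea follows; the function names and the unbiased sampling rule P(z = +1) = (1 + e)/2 are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def stochastic_binarize(e, rng):
    """Map activations e in [-1, 1] to {-1, +1} stochastically.

    Assumed sampling rule: P(z = +1) = (1 + e) / 2, which makes the
    quantizer unbiased, i.e. E[z] = e.  The deterministic variant in the
    quote (z = -1 when e < 0, else +1) is the inference-time choice.
    """
    p = (1.0 + e) / 2.0
    return np.where(rng.random(e.shape) < p, 1.0, -1.0)

def binarize_backward(grad_z):
    """Straight-through estimator: dz/de is treated as 1, so the
    gradient w.r.t. z passes through unchanged to e."""
    return grad_z
```

Because binarization itself has zero gradient almost everywhere, the straight-through trick is what lets the encoder receive a training signal at all.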
“…The most representative works can be divided into two branches: recurrent neural network (RNN) based models and convolutional neural network (CNN) based models. RNN based models [16,34,35] compress images or residual information from the previous step iteratively, while CNN based models typically transform images into compact latent representations for further entropy coding. Some early works [1,4,33] solve the problem of non-differentiable quantization and rate estimation.…”
Section: Image Compression (mentioning)
confidence: 99%
“…In the early stage, [34] uses an RNN structure to compress raw and residual information iteratively. After each iteration, the bitstream grows and the quality of the reconstructed image is enhanced.…”
Section: Learned Scalable Image Compression (mentioning)
confidence: 99%
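The progressive scheme described in the quote — encode the image, then repeatedly encode what the previous reconstruction missed — can be sketched in a few lines. The `encode`/`decode` pair below is a hypothetical stand-in for the RNN coder, used only to show how each iteration adds bits and refines the reconstruction:

```python
import numpy as np

def iterative_residual_compress(image, codec, num_iters):
    """Progressive compression sketch: each iteration encodes the residual
    left by the previous reconstruction, so the bitstream grows and the
    reconstruction improves.  `codec` is a hypothetical (encode, decode)
    pair standing in for the learned RNN coder."""
    encode, decode = codec
    residual = image
    bitstream = []
    reconstruction = np.zeros_like(image)
    for _ in range(num_iters):
        bits = encode(residual)             # fixed-size code per iteration
        bitstream.append(bits)
        reconstruction = reconstruction + decode(bits)
        residual = image - reconstruction   # what remains to be coded
    return bitstream, reconstruction
```

Truncating the bitstream after any iteration still yields a valid (coarser) reconstruction, which is what makes the scheme variable-rate and scalable.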