2021
DOI: 10.48550/arxiv.2105.03939
Preprint

Lightweight Image Super-Resolution with Hierarchical and Differentiable Neural Architecture Search

Abstract: Single Image Super-Resolution (SISR) tasks have achieved significant performance with deep neural networks. However, the large number of parameters in CNN-based methods for SISR tasks requires heavy computations. Although several efficient SISR models have been recently proposed, most are handcrafted and thus lack flexibility. In this work, we propose a novel differentiable Neural Architecture Search (NAS) approach on both the cell-level and network-level to search for lightweight SISR models. Specifically, the …
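
The cell-level differentiable search the abstract describes follows the general DARTS recipe: each candidate operation gets a continuous architecture weight, and the cell computes a softmax-weighted mixture so the weights can be learned by gradient descent. A minimal PyTorch sketch of that relaxation, assuming a lightweight-SR search space of cheap convolution variants (class and operation choices here are illustrative, not the paper's code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted mixture of candidate ops (DARTS-style relaxation)."""
    def __init__(self, channels):
        super().__init__()
        # Candidate operations; a lightweight-SR search space typically
        # mixes inexpensive convolution variants like these.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),              # normal conv
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2),  # dilated conv
            nn.Sequential(                                            # separable conv
                nn.Conv2d(channels, channels, 3, padding=1, groups=channels),
                nn.Conv2d(channels, channels, 1),
            ),
        ])
        # One architecture parameter per candidate, learned jointly
        # with the network weights by gradient descent.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```

After search, only the operation with the largest architecture weight is kept and the rest are pruned, yielding a discrete lightweight cell.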

Cited by 6 publications (6 citation statements) | References 45 publications

“…FERN [26] inserts group convolution into the wide activation convolution [27,28] module to boost performance while keeping fewer parameters and faster execution. DLSR [29] proposes the MRB structure, which is composed of a mixed layer, a residual connection, and a ReLU layer. The mixed layer is made up of multiple operations, including normal convolution, separable convolution, and dilated convolution [30]. PAN [31] employs the self-calibrated convolution scheme in its network for efficient SR, which has the fewest parameters among all participants in AIM 2020.…”
Section: Related Work 21 Cnn-based Single Image Super-resolutionmentioning
confidence: 99%
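
The MRB described above pairs a searched mixed layer with a residual connection and a ReLU. A minimal PyTorch sketch of that wiring, inferred from the citation text rather than the authors' code (the mixed layer can be any module, e.g. the MixedOp sketched earlier):

```python
import torch.nn as nn

class MRB(nn.Module):
    """Mixed Residual Block as described for DLSR [29]: a mixed layer,
    a residual connection, and a ReLU layer. The structure is inferred
    from the citation statement; names are illustrative."""
    def __init__(self, mixed_layer: nn.Module):
        super().__init__()
        self.mixed = mixed_layer
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Residual connection around the searched mixed layer.
        return self.relu(x + self.mixed(x))
```
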
“…RFDN [36] applied residual feature distillation blocks, which are a variant of IMDB and are more powerful and flexible. DLSR [37] introduced a differentiable neural architecture search method to find more powerful fusion blocks based on RFDB.…”
Section: Related Workmentioning
confidence: 99%
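
The residual feature distillation idea that DLSR's searched fusion blocks build on splits features at each stage: a thin 1x1 convolution "distills" a slice of the channels while the remainder is refined further, and the distilled slices are concatenated and fused. A rough PyTorch sketch under those assumptions (channel ratios, depth, and names are illustrative, not RFDN's exact block):

```python
import torch
import torch.nn as nn

class DistillationBlock(nn.Module):
    """Rough sketch of an RFDN-style [36] distillation block: at each
    stage a 1x1 conv distills part of the features while a 3x3 conv
    refines the rest; distilled slices are concatenated and fused.
    Channel ratios and depth are illustrative assumptions."""
    def __init__(self, channels, distilled=None, stages=3):
        super().__init__()
        distilled = distilled or channels // 2
        self.distill = nn.ModuleList(
            [nn.Conv2d(channels, distilled, 1) for _ in range(stages)]
        )
        self.refine = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=1) for _ in range(stages)]
        )
        self.fuse = nn.Conv2d(distilled * stages, channels, 1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        distilled_feats = []
        feat = x
        for d, r in zip(self.distill, self.refine):
            distilled_feats.append(self.act(d(feat)))  # distilled branch
            feat = self.act(r(feat))                   # refined branch
        out = self.fuse(torch.cat(distilled_feats, dim=1))
        return out + x  # residual connection around the block
```
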
“…In detail, the shallow residual block in [51] is replaced by the proposed EDBB to construct the EFDB. Unlike IMDN and RFDN, which utilize global distillation connections to process input features progressively, neural architecture search (NAS) [30] is adopted to decide the feature connection paths. The searched structure is shown in the orange dashed box.…”
Section: Nku-esrmentioning
confidence: 99%
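
Searching connection paths, as in the entry above, can be relaxed the same way as operation search: each candidate input edge into a node gets an architecture weight, and the node consumes a softmax-weighted sum of the features available to it. A minimal PyTorch sketch of that idea, again with illustrative names rather than the cited entry's code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SearchedConnections(nn.Module):
    """Differentiable search over feature connection paths: node i takes a
    softmax-weighted sum of all earlier features. After search, only the
    strongest edges are kept as the discrete connection pattern."""
    def __init__(self, channels, num_nodes=4):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=1) for _ in range(num_nodes)]
        )
        # betas[i] weights the candidate inputs (x plus nodes 0..i-1) to node i.
        self.betas = nn.ParameterList(
            [nn.Parameter(torch.zeros(i + 1)) for i in range(num_nodes)]
        )

    def forward(self, x):
        feats = [x]
        for conv, beta in zip(self.convs, self.betas):
            w = F.softmax(beta, dim=0)
            agg = sum(wi * f for wi, f in zip(w, feats))
            feats.append(conv(agg))
        return feats[-1]
```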