2020
DOI: 10.1109/tbc.2020.2977513

Lightweight Super-Resolution Using Deep Neural Learning

Abstract: There is a gap between the rapid development of 4K display technologies and the shortage of 4K content. Super-Resolution (SR) serves as a bridge between this supply and demand. Recently, Convolutional Neural Network (CNN) based networks have demonstrated strong performance in image SR. However, most existing methods require large model capacity and expensive computation to reach high performance. Besides, most methods keep the upscaling part relatively simple compared with the feature extraction part. For …
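To make concrete the split the abstract refers to between a feature-extraction trunk and an upscaling tail, here is a minimal PyTorch-style sketch of a generic lightweight CNN SR model with a sub-pixel (PixelShuffle) upscaling tail. This is an illustrative sketch only, not the architecture proposed in the paper; the class name TinySR, the channel width, and the block count are assumptions chosen for brevity.

    # Generic lightweight SR sketch (illustrative; not the paper's model).
    import torch
    import torch.nn as nn

    class TinySR(nn.Module):
        def __init__(self, scale=4, channels=32, num_blocks=4):
            super().__init__()
            # Feature extraction: a shallow stack of 3x3 conv + ReLU layers.
            body = [nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True)]
            for _ in range(num_blocks):
                body += [nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True)]
            self.features = nn.Sequential(*body)
            # Upscaling tail: expand channels by scale^2, then PixelShuffle
            # rearranges them into a scale-times-larger RGB image.
            self.upscale = nn.Sequential(
                nn.Conv2d(channels, 3 * scale * scale, 3, padding=1),
                nn.PixelShuffle(scale),
            )

        def forward(self, x):
            return self.upscale(self.features(x))

    # Usage: a 48x48 low-resolution patch becomes a 192x192 output at scale 4.
    lr = torch.randn(1, 3, 48, 48)
    print(TinySR(scale=4)(lr).shape)  # torch.Size([1, 3, 192, 192])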

Cited by 23 publications (6 citation statements)
References 42 publications
“…We undertake experiments with the BI degradation model and compare our AFAN families with existing advanced networks: SRCNN [9], FSRCNN [4], VDSR [21], DRCN [22], LapSRN [24], DRRN [47], SelNet [6], Mem-Net [48], SRMDNF [63], IDN [18], CARN [1], CARN-M [1], SRFBN-S [30], CBPN [69], CBPN-S [69], AWSRN-M [53], OISR-RK2-s [13], MoreMNAS-A [8], A2F-S [54], LESRCNN [50], SPBP-L [2], RMUN [20], FALSR-A [7], FALSR-B [7], FALSR-C [7], WMRN [45], LMAN-s [52], MADNet-L1 [25], MSWSR [61], Cross-SRN [34], ACNet [49], CRMBN [56], DRSAN-48s [41], and DRSAN-48m [41]. It is worth mentioning that we make comparisons based on the similarity of network parameter counts, whereby networks at the same scale factor are partitioned into multiple groups.…”
Section: Results With BI Degradation
confidence: 99%
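For context, the "BI" setting in the statement above is the standard bicubic degradation model, in which low-resolution inputs are generated by bicubic downsampling of the high-resolution images. A minimal sketch follows, assuming Pillow is available; the function name bi_degrade is illustrative.

    # Bicubic (BI) degradation: downsample the HR image by the scale factor.
    from PIL import Image

    def bi_degrade(hr: Image.Image, scale: int = 4) -> Image.Image:
        w, h = hr.size
        return hr.resize((w // scale, h // scale), resample=Image.BICUBIC)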
“…We compared the proposed DATN with existing SR networks on the BI degradation model: SRCNN [8], FSRCNN [3], VDSR [9], DRCN [14], LapSRN [40], DRRN [15], SelNet [41], MemNet [42], SRMDNF [43], IDN [17], CARN [19], CARN-M [19], SRFBN-S [44], CBPN [45], CBPN-S [45], AWSRN-M [46], OISR-RK2-s [47], A2F-S [21], LESRCNN [20], SPBP-L [48], RMUN [49], FALSR-A [16], FALSR-B [16], FALSR-C [16], WMRN [50], LMAN-s [51], MADNet-L1 [52], MSWSR [13], Cross-SRN [53], ACNet [54], CRMBN [55], DRSAN-48m [56], AFAN [57], ESRT [58], and LBNet [59] …”
Section: Methods
confidence: 99%
“…The proposed MRAN is compared with several recent SR methods on ×2, ×4 and ×8 scales, including Bicubic, DRRN [4], LapSRN [7], MS-LapSRN [8], MGBP [9], RMUN [13].…”
Section: Comparison With the State-of-the-Art Methods
confidence: 99%
“…Recently, Michelini et al. [9] also adopted the progressive upsampling SR framework and achieved relatively strong results. Some other works, such as EDSR [10], RCAN [11], SAN [12], RMUN [13] and so on, have been continually proposed and have significantly improved SISR performance.…”
Section: Introduction
confidence: 99%