2020
DOI: 10.1109/tmm.2020.2973862

Learning Non-Locally Regularized Compressed Sensing Network With Half-Quadratic Splitting

Cited by 48 publications (23 citation statements)
References 35 publications
“…Our proposed ISTA-Net++ is compared with several representative state-of-the-art methods including BM3D-AMP [18], LDAMP [19], DIP [20], ReconNet [28], DPDNN [14], GDN [15], ISTA-Net+ [2], NLR-CSNet [21], DPA-Net [10], and MAC-Net [9]. Different from other deep network-based end-to-end methods, for ISTA-Net++ we use the five sampling matrices for five CS ratios to train our model only once.…”
Section: Comparison With State-of-the-art Methods
confidence: 99%
“…Specifically, considering the optimization-based methods, three representative CS schemes are selected, i.e., TV [53], MH [32] and GSR [31]. In view of the deep network-based methods, more than ten CS algorithms are considered, including: SDA [33], ReconNet [16], LDIT [20], LDAMP [20], DPDNN [63], I-Recon [17], ISTA-Net [21], DR²-Net [15], IRCNN [42], NN [64], NLR-CSNet [54], DPA-Net [37] and DPIR [44]. b) For the learned sampling matrix-based CS methods, seven recent works, i.e., CSNet [55], LapCSNet [35], SCSNet [19], CSNet+ [6], BCS-Net [39], OPINE-Net+ [23] and AMP-Net [22], participate in the comparison in our experiments.…”
Section: Comparisons With State-of-the-art Methods
confidence: 99%
“…3) The proposed non-local module globally explores the self-similar knowledge in the entire image space, which significantly expands the receptive field. Recently, inspired by Deep Image Prior (DIP) [56], Sun et al. [54] propose a non-locally regularized CS network, in which the non-local prior and the deep network prior are both considered for image reconstruction. However, for each image, the network in [54] needs to be trained online, driven by the two priors in an iterative fashion, which undoubtedly brings about a high computational cost and a lack of flexibility.…”
Section: B. Non-local Self-similarity Image Prior
confidence: 99%
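The scheme described in the quotation above, a network optimized online for each image under both a deep-network prior and a non-local self-similarity prior, can be illustrated with the minimal sketch below. This is not the authors' code: the tiny encoder-decoder, the patch-distance regularizer, the loss weight, and the names `TinyDIP`, `nonlocal_regularizer`, and `reconstruct` are all illustrative assumptions standing in for the paper's actual architecture and priors.

```python
# Minimal sketch (not the cited paper's implementation) of per-image, online
# reconstruction driven by two priors: a deep-network (DIP-style) prior and a
# crude non-local self-similarity penalty. All sizes and weights are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def nonlocal_regularizer(img, patch=8, stride=8):
    """Pull each patch towards its most similar other patch in the image
    (an assumed stand-in for the paper's group-based non-local prior)."""
    p = F.unfold(img, kernel_size=patch, stride=stride)        # (1, patch*patch, N)
    p = p.squeeze(0).t()                                       # (N, patch*patch)
    sq = p.pow(2).sum(dim=1)
    d2 = sq.unsqueeze(1) + sq.unsqueeze(0) - 2.0 * p @ p.t()   # squared distances
    d2 = d2 + torch.eye(d2.shape[0]) * 1e9                     # mask out self-matches
    return d2.min(dim=1).values.clamp(min=0).mean()

class TinyDIP(nn.Module):
    """A very small convolutional network standing in for the deep-network prior."""
    def __init__(self, ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 1, 3, padding=1),
        )
    def forward(self, z):
        return self.net(z)

def reconstruct(y, Phi, size=64, iters=500, lam=1e-3):
    """Online, per-image optimization: fit the network so that Phi @ G(z)
    matches the measurements y while the non-local term regularizes G(z)."""
    z = torch.randn(1, 1, size, size)          # fixed random input code
    net = TinyDIP()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(iters):
        x_hat = net(z)
        loss = F.mse_loss(Phi @ x_hat.reshape(-1), y) + lam * nonlocal_regularizer(x_hat)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return net(z).detach()

# Illustrative usage with a random Gaussian sampling matrix:
# n, m = 64 * 64, 410                      # roughly a 10% sampling ratio
# Phi = torch.randn(m, n) / m ** 0.5
# y = Phi @ ground_truth_image.reshape(-1)
# x_hat = reconstruct(y, Phi)
```

The sketch makes the cost structure visible: a measurement-fidelity term plus a non-local term jointly drive an optimization that must be rerun for every image, which is exactly why the quoted passage points to the high computational cost and lack of flexibility of this online scheme.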
“…Network-driven methods: deep networks have made gratifying progress in vision-related tasks [15,20,21]. With the help of the excellent representation ability of deep networks, some scholars apply them to compressive sensing reconstruction, forming network-driven reconstruction methods.…”
Section: Related Work
confidence: 99%
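As a companion to the quoted remark on network-driven reconstruction, the sketch below shows the general shape of such a method: a learned sampling layer produces block-wise measurements and a learned decoder maps them back to an image in a single forward pass. The block size of 33, the channel widths, and the class name `BlockCSNet` are illustrative assumptions, not details of any specific cited network.

```python
# Minimal sketch of an end-to-end, network-driven CS reconstruction model.
# All architectural choices here are assumptions for illustration only.
import torch
import torch.nn as nn

class BlockCSNet(nn.Module):
    def __init__(self, block=33, ratio=0.25):
        super().__init__()
        n = block * block
        m = int(n * ratio)
        self.sample = nn.Linear(n, m, bias=False)   # plays the role of the sampling matrix Phi
        self.init_recon = nn.Linear(m, n)           # initial linear reconstruction
        self.refine = nn.Sequential(                # convolutional residual refinement
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )
        self.block = block

    def forward(self, x_blocks):                    # x_blocks: (B, block*block) flattened blocks
        y = self.sample(x_blocks)                   # compressive measurements
        x0 = self.init_recon(y)                     # coarse estimate from measurements
        img = x0.view(-1, 1, self.block, self.block)
        return img + self.refine(img)               # refined reconstruction

# Training would minimize ||x_hat - x|| over an image dataset, so that at test
# time reconstruction is a single forward pass rather than a per-image solve.
```

End-to-end training over a dataset is what distinguishes this family from the per-image online optimization sketched earlier.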