2020
DOI: 10.1109/tci.2020.2990299
Multi-Scale Learned Iterative Reconstruction

Abstract: Model-based learned iterative reconstruction methods have recently been shown to outperform classical reconstruction methods. The applicability of these methods to large-scale inverse problems is, however, limited by the memory available for training and by extensive training times. As a possible solution to these restrictions, we propose a multi-scale learned iterative reconstruction algorithm that computes iterates on discretisations of increasing resolution. This procedure not only reduces memory requirements, it…

Cited by 36 publications (28 citation statements)
References 42 publications
“…In particular, we are motivated by model-based learned iterative reconstruction techniques that have been shown to be highly successful in a variety of application areas [1,2,17,21,35]. These methods generally mimic iterative gradient descent schemes and achieve impressive reconstruction results, often with considerable speed-ups [18], but are mostly empirically motivated and lack convergence guarantees. In contrast, this paper follows a recent line of work on combining deep learning methods with classical reconstruction algorithms, such as variational techniques, while retaining established theoretical results on convergence.…”
Section: Introduction
confidence: 99%
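The "mimic iterative gradient descent" structure mentioned in the statement above can be sketched roughly as x_{k+1} = x_k + Λ_θ(x_k, ∇d(x_k)), where Λ_θ would be a trained network. The following is a minimal illustrative sketch only: `Lambda_theta` is a fixed-weight stand-in for a learned update, and `A`, `y` define a toy linear inverse problem; none of these names come from the paper.

```python
import numpy as np

# Toy linear inverse problem: recover x from y = A x.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))       # toy forward operator
x_true = rng.standard_normal(10)
y = A @ x_true                          # noiseless measurements

step = 1.0 / np.linalg.norm(A, 2) ** 2  # safe step size for the data-fit gradient

def Lambda_theta(x, grad):
    # Placeholder "learned" update: a scaled negative gradient step.
    # A real learned iterative scheme would apply a trained network
    # to (x, grad) here instead.
    return -step * grad

x = np.zeros(10)
for _ in range(500):
    grad = A.T @ (A @ x - y)            # gradient of 0.5 * ||A x - y||^2
    x = x + Lambda_theta(x, grad)

print(np.linalg.norm(A @ x - y))        # data residual shrinks toward 0
```

Replacing the fixed step with a network trained end-to-end across a small, fixed number of such iterations is what gives these methods their speed-ups, at the cost of the convergence guarantees the quoted paper aims to restore.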
“…where J(σ) is the Jacobian of the simulated voltages U(σ) (e.g., computed by [34], [35]). In Newton's method, the Hessian is computed exactly according to (11). In the Gauss-Newton (GN) method, the second term is ignored because computing the second-order derivative of U(σ) is costly.…”
Section: B. The Inverse Problem for EIT
confidence: 99%
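The Gauss-Newton step described in this statement can be illustrated on a small nonlinear least-squares problem min_σ ||U(σ) − V||². This is a sketch under stated assumptions: `U` and its Jacobian `J` below are toy placeholders, not the EIT forward solver; the step solves (JᵀJ) d = −Jᵀ r, which is exactly the Hessian of (11) with the second-order term dropped.

```python
import numpy as np

def U(sigma):
    # Toy nonlinear forward model standing in for the simulated voltages.
    return np.array([sigma[0] ** 2 + sigma[1],
                     np.sin(sigma[0]) + sigma[1] ** 2])

def J(sigma):
    # Analytic Jacobian of the toy model above.
    return np.array([[2.0 * sigma[0], 1.0],
                     [np.cos(sigma[0]), 2.0 * sigma[1]]])

def gauss_newton(sigma, V, iters=20):
    for _ in range(iters):
        r = U(sigma) - V                       # residual
        Jk = J(sigma)
        # GN step: solve (J^T J) d = -J^T r, ignoring the second-order
        # derivative term that full Newton would require.
        d = np.linalg.solve(Jk.T @ Jk, -Jk.T @ r)
        sigma = sigma + d
    return sigma

V = U(np.array([0.8, 0.5]))                    # synthetic data from a known sigma
rec = gauss_newton(np.array([1.0, 1.0]), V)
print(rec)
```

For small residuals the dropped term is negligible and GN retains near-Newton convergence while needing only first-order derivatives of the forward model.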
“…Another idea for scaling learned iterative schemes to 3D is to compute the forward model on multiple lower resolutions during the reconstruction process [171].…”
Section: 3D Nature of PAT
confidence: 99%