2020
DOI: 10.1080/01630563.2020.1740734
A Data-Driven Iteratively Regularized Landweber Iteration

Abstract: We derive and analyse a new variant of the iteratively regularized Landweber iteration for solving linear and nonlinear ill-posed inverse problems. The method takes into account training data, which are used to estimate the interior of a black box that defines the iteration process. We prove convergence and stability for the scheme in infinite-dimensional Hilbert spaces. These theoretical results are complemented by several numerical experiments for solving linear inverse problems for the Radon tr…
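The abstract describes the scheme only at a high level. Below is a minimal numerical sketch of the general idea for a linear forward operator, assuming that the trained black box enters the update as a data-driven reference element and that iterations are stopped by the Morozov discrepancy principle; the function name `data_driven_landweber`, the specific update formula, and all parameter values are illustrative assumptions, not the exact scheme derived in the paper.

```python
import numpy as np

def data_driven_landweber(A, y_delta, x0, B, delta, tau=1.5,
                          omega=None, lam0=0.1, q=0.9, max_iter=500):
    """Sketch of an iteratively regularized Landweber iteration with a
    data-driven component (illustrative update, not the paper's exact scheme):

        x_{k+1} = x_k - omega * A^T (A x_k - y_delta) - lam_k * (x_k - B(y_delta)),
        lam_k   = lam0 * q**k,

    stopped by the Morozov discrepancy principle ||A x_k - y_delta|| <= tau * delta.
    """
    if omega is None:
        omega = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1 / ||A||_2^2, safe for the Landweber part
    x = x0.copy()
    x_prior = B(y_delta)                          # data-driven reference element
    for k in range(max_iter):
        residual = A @ x - y_delta
        if np.linalg.norm(residual) <= tau * delta:   # discrepancy principle
            break
        lam_k = lam0 * q ** k
        x = x - omega * (A.T @ residual) - lam_k * (x - x_prior)
    return x, k

# toy usage on a small noisy linear problem
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40))
x_true = rng.standard_normal(40)
noise = rng.standard_normal(80)
noise *= 0.05 * np.linalg.norm(A @ x_true) / np.linalg.norm(noise)  # 5% relative noise
delta = np.linalg.norm(noise)                      # known noise level
y_delta = A @ x_true + noise
B = lambda y: np.zeros(40)                         # trivial prior estimate
x_rec, stopped_at = data_driven_landweber(A, y_delta, np.zeros(40), B, delta)
```

In the toy usage, `B` is only a stand-in; in the data-driven setting it would be learned from training pairs rather than fixed.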

Cited by 16 publications (14 citation statements)
References 25 publications
“…The (possibly non-linear) operator B : X → Y can now be represented by a deep neural network that can be trained against supervised data by comparing the final iterate in (6.6) (iterates are stopped following the Morozov discrepancy principle (4.1)) against the ground truth for given data. Convergence and stability for the scheme in (6.6) in infinite-dimensional Hilbert spaces are proved by Aspri et al. (2018). These theoretical results are complemented by several numerical experiments for solving linear inverse problems for the Radon transform and a non-linear inverse problem of Schlieren tomography.…”
Section: Learned Landweber
confidence: 85%
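The training loop implied by this statement, comparing the final iterate against the ground truth, can be sketched by unrolling a fixed number of iterations so that the reconstruction stays differentiable; the data-dependent discrepancy-principle stop is replaced by a fixed iteration count for that reason. The network architecture, the way B enters the update, and the names `BlackBoxB`, `unrolled_reconstruction`, and `train` are hypothetical choices for illustration, not those of the cited works.

```python
import torch
import torch.nn as nn

class BlackBoxB(nn.Module):
    """Small fully connected stand-in for the learned operator B : X -> Y."""
    def __init__(self, n_x, n_y, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_x, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_y))

    def forward(self, x):
        return self.net(x)

def unrolled_reconstruction(A, y_delta, B, n_iter=10, omega=1e-2, lam=0.1):
    """Unroll a fixed number of Landweber-type steps with a learned term.

    Assumed per-sample update (illustration only):
        x_{k+1} = x_k - omega * A^T (A x_k - y) - lam * A^T (B(x_k) - y)
    Shapes: A is (m, n), y_delta is (b, m), x is (b, n).
    """
    x = torch.zeros(y_delta.shape[0], A.shape[1])
    for _ in range(n_iter):
        x = x - omega * (x @ A.T - y_delta) @ A - lam * (B(x) - y_delta) @ A
    return x

def train(B, A, loader, epochs=50, lr=1e-3):
    """Supervised training: compare the final iterate against the ground truth."""
    opt = torch.optim.Adam(B.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x_true, y_delta in loader:       # batches of (b, n) and (b, m) tensors
            x_rec = unrolled_reconstruction(A, y_delta, B)
            loss = loss_fn(x_rec, x_true)    # final iterate vs. ground truth
            opt.zero_grad()
            loss.backward()
            opt.step()
```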
“…Convergence and stability for the scheme in (6.6) in infinite-dimensional Hilbert spaces are proved by Aspri et al. (2018). These theoretical results are complemented by several numerical experiments for solving linear inverse problems for the Radon transform and a non-linear inverse problem of Schlieren tomography.…”
Section: Special Topics
confidence: 98%
“…Nowadays, with the rise of big data, methods that combine forward modeling with data-driven techniques are being developed [7]. Some of these techniques build upon the similarity between deep neural networks and classical approaches to inverse problems such as iterative regularization [8,9] and proximal methods [10]. Others are based on post-processing of the reconstructions obtained by a simple inversion technique such as filtered backprojection [11].…”
Section: Introduction
confidence: 99%
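The last sentence of this statement refers to learned post-processing of a simple inversion such as filtered backprojection. A minimal sketch of that pattern is given below, assuming scikit-image's `radon`/`iradon` for the simulation and FBP steps and a small, untrained residual CNN as an illustrative post-processing network; none of the names or architectural choices come from reference [11].

```python
import numpy as np
import torch
import torch.nn as nn
from skimage.transform import radon, iradon

class PostProcessCNN(nn.Module):
    """Tiny residual CNN applied to an FBP reconstruction (illustrative only)."""
    def __init__(self, channels=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)          # learn a residual correction to the FBP image

def fbp_then_postprocess(sinogram, angles, model):
    """Filtered backprojection followed by the learned post-processing step."""
    fbp = iradon(sinogram, theta=angles, filter_name='ramp')
    with torch.no_grad():
        t = torch.from_numpy(fbp).float()[None, None]    # shape (1, 1, H, W)
        out = model(t)
    return out.squeeze().numpy()

# usage sketch: simulate noisy data from a box phantom and reconstruct;
# in practice the model would first be trained on (FBP, ground-truth) pairs
rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = radon(phantom, theta=angles)
sinogram += 0.05 * sinogram.max() * rng.standard_normal(sinogram.shape)
recon = fbp_then_postprocess(sinogram, angles, PostProcessCNN())
```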