2019
DOI: 10.1515/jiip-2018-0093

The Ivanov regularized Gauss–Newton method in Banach space with an a posteriori choice of the regularization radius

Abstract: In this paper we consider the iteratively regularized Gauss–Newton method, where regularization is achieved by Ivanov regularization, i.e., by imposing a priori constraints on the solution. We propose an a posteriori choice of the regularization radius, based on an inexact Newton / discrepancy principle approach, prove convergence and convergence rates under a variational source condition as the noise level tends to zero, and provide an analysis of the discretization error. Our results are valid in general, po…
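To make the method concrete, below is a minimal numerical sketch in Python of an Ivanov-regularized Gauss–Newton iteration. This is not the paper's algorithm: the geometric radius growth with a fixed residual-reduction test is a simple stand-in for the a posteriori, inexact-Newton-style radius choice analyzed in the paper, the constraint set is a Euclidean norm ball rather than a general constraint in Banach space, and all names (F, Fprime, ball_constrained_lsq, tau, q) are illustrative assumptions.

```python
import numpy as np

def ball_constrained_lsq(A, b, rho, tol=1e-10, max_bisect=100):
    """Solve min ||A z - b|| subject to ||z|| <= rho.

    If the unconstrained least-squares solution already satisfies the
    constraint, return it; otherwise find the Lagrange multiplier lam > 0
    with ||z(lam)|| = rho by bisection, where
    z(lam) = (A^T A + lam I)^{-1} A^T b (||z(lam)|| decreases in lam)."""
    z = np.linalg.lstsq(A, b, rcond=None)[0]
    if np.linalg.norm(z) <= rho:
        return z
    AtA, Atb = A.T @ A, A.T @ b
    I = np.eye(A.shape[1])
    lo, hi = 0.0, 1.0
    while np.linalg.norm(np.linalg.solve(AtA + hi * I, Atb)) > rho:
        hi *= 2.0  # grow hi until the constraint is satisfied
    for _ in range(max_bisect):
        lam = 0.5 * (lo + hi)
        z = np.linalg.solve(AtA + lam * I, Atb)
        if np.linalg.norm(z) > rho:
            lo = lam
        else:
            hi = lam
        if hi - lo < tol:
            break
    return z

def ivanov_gauss_newton(F, Fprime, y_delta, x0, delta, tau=2.0,
                        rho0=1.0, q=1.5, max_iter=50):
    """Ivanov-regularized Gauss-Newton sketch: at each step, minimize the
    linearized residual under the constraint ||x|| <= rho, enlarging rho
    (a crude stand-in for the paper's a posteriori radius choice) until
    the linearized residual is sufficiently reduced; stop via the
    discrepancy principle ||F(x) - y_delta|| <= tau * delta."""
    x, rho = np.asarray(x0, float), rho0
    for _ in range(max_iter):
        r = F(x) - y_delta
        if np.linalg.norm(r) <= tau * delta:   # discrepancy principle
            break
        A = Fprime(x)
        while True:
            # linearized problem: min ||A (z - x) + r|| s.t. ||z|| <= rho
            z = ball_constrained_lsq(A, A @ x - r, rho)
            if (np.linalg.norm(A @ (z - x) + r) <= 0.9 * np.linalg.norm(r)
                    or rho > 1e8):             # safeguard against stalling
                break
            rho *= q
        x = z
    return x
```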

Cited by 3 publications (5 citation statements)
References 31 publications
“…(which for = 0 follows from convexity of J, i.e., monotonicity of ∇J) and assuming approximate stationarity. Using (18), (19), we get from (17) that for all k ≤ k_* − 1 with k_* defined by the estimate…”
Section: A Projected Gradient Method
confidence: 99%
“…Alternatively, we can further estimate (17) under a condition following from (18), (19) and comprising both convexity and approximate stationarity, which for k ≤ k_* − 1 implies … as well as … Using (23)–(24) we get from (17)…”
Section: A Projected Gradient Method
confidence: 99%
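The quoted fragments concern a projected gradient method with a stopping index k_* defined through estimates the truncated snippets do not fully reproduce. Purely for orientation, here is a generic sketch of such an iteration, assuming a convex objective J (so ∇J is monotone, as the quotes note) and a norm-ball constraint; the stopping rule below is a simple fixed-point criterion, not the k_* from the cited work.

```python
import numpy as np

def project_ball(x, rho):
    """Euclidean projection onto the ball {x : ||x|| <= rho}."""
    n = np.linalg.norm(x)
    return x if n <= rho else (rho / n) * x

def projected_gradient(gradJ, x0, rho, step=1e-2, max_iter=1000, tol=1e-8):
    """Generic projected gradient iteration
    x_{k+1} = P_C(x_k - mu * grad J(x_k)) for the constraint set
    C = {x : ||x|| <= rho}. For convex J and a suitable step size this
    converges to a constrained minimizer; a fixed-point criterion
    replaces the discrepancy-type definition of k_* in the cited work."""
    x = np.asarray(x0, float)
    for k in range(max_iter):
        x_new = project_ball(x - step * gradJ(x), rho)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new, k
        x = x_new
    return x, max_iter
```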