2017
DOI: 10.1088/1361-6420/aa7ac7

Convergence analysis of a two-point gradient method for nonlinear ill-posed problems

Abstract: We perform a convergence analysis of a two-point gradient method which is based on Landweber iteration and on Nesterov's acceleration scheme. Additionally, we show the usefulness of this method via two numerical example problems based on a nonlinear Hammerstein operator and on the nonlinear inverse problem of single photon emission computed tomography.
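
For readers less familiar with the scheme, the iteration alternates a Nesterov-type extrapolation over the previous two iterates with a Landweber gradient step. The Python sketch below is only a minimal illustration: the operator F, its Jacobian dF, the step size omega, the weight n/(n + alpha) and the discrepancy-principle stopping rule with parameter tau are illustrative assumptions, not the exact choices analysed in the paper.

```python
import numpy as np

def two_point_gradient(F, dF, y_delta, x0, delta, tau=1.5, alpha=3.0,
                       omega=0.2, max_iter=500):
    """Minimal sketch of a Nesterov-accelerated Landweber (two-point gradient) iteration."""
    x_prev = x0.copy()
    x_curr = x0.copy()
    for n in range(max_iter):
        # Discrepancy principle: stop once the residual drops below tau * delta.
        if np.linalg.norm(F(x_curr) - y_delta) <= tau * delta:
            break
        lam = n / (n + alpha)                      # Nesterov-type combination weight
        z = x_curr + lam * (x_curr - x_prev)       # extrapolation using the two previous iterates
        grad = dF(z).T @ (F(z) - y_delta)          # Landweber direction evaluated at z
        x_prev, x_curr = x_curr, z - omega * grad  # gradient step from the extrapolated point
    return x_curr

if __name__ == "__main__":
    # Toy nonlinear operator F(x) = x + x**3 (purely illustrative, well-posed).
    F = lambda x: x + x**3
    dF = lambda x: np.diag(1.0 + 3.0 * x**2)
    x_true = np.array([0.3, -0.2, 0.5])
    delta = 1e-3
    y_delta = F(x_true) + delta * np.array([1.0, -1.0, 1.0]) / np.sqrt(3.0)
    print(two_point_gradient(F, dF, y_delta, np.zeros(3), delta))
```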

Cited by 49 publications (82 citation statements)
References 23 publications

“…In this paper we will consider (3.3) with λ_n^δ satisfying suitable conditions to be specified later. Note that our method (3.3) requires the use of the previous two iterates at every iteration step, which follows the spirit of [14]; on the other hand, our method allows the use of a general p-convex penalty function Θ, which could be non-smooth, to reconstruct solutions with special features such as sparsity and discontinuities.…”
Section: The two-point gradient method (mentioning)
confidence: 99%
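
For orientation, two-point gradient iterations with a general convex penalty Θ are commonly stated with an auxiliary dual variable ξ_n^δ. A representative form from this line of work is sketched below; the precise definition of the citing paper's method (3.3) may differ.

```latex
z_n^{\delta} = \xi_n^{\delta} + \lambda_n^{\delta}\bigl(\xi_n^{\delta} - \xi_{n-1}^{\delta}\bigr),
\qquad
x_n^{\delta} = \arg\min_{x}\bigl\{\Theta(x) - \langle z_n^{\delta}, x\rangle\bigr\},
\qquad
\xi_{n+1}^{\delta} = z_n^{\delta} - \mu_n^{\delta}\,F'(x_n^{\delta})^{*}\bigl(F(x_n^{\delta}) - y^{\delta}\bigr).
```

Here μ_n^δ is the step size and λ_n^δ the combination parameter; for the quadratic penalty Θ(x) = ½‖x‖² the minimization step gives x_n^δ = z_n^δ, and the scheme essentially reduces to the Hilbert-space two-point gradient method analysed in this paper.
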
“…Therefore, it is necessary to find another strategy for generating λ_n^δ such that (3.15) and (3.27) hold. We will adapt the discrete backtracking search (DBTS) algorithm introduced in [14] to our situation. To this end, we take a function q : N → N that is non-increasing and satisfies ∑_{i=0}^∞ q(i) < ∞.…”
Citation type: mentioning
confidence: 99%
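
A schematic sketch of how such a discrete backtracking search for λ_n^δ can be organized is given below. The halving candidate grid, the acceptance test `accept`, and the example budget q are placeholders for illustration; the actual DBTS algorithm of [14] uses a specific test that guarantees the required estimates.

```python
def dbts_lambda(n, accept, q, lam_max=1.0):
    """Pick lambda_n by trying at most q(n) candidates on a halving grid."""
    lam = lam_max
    for _ in range(q(n)):      # per-iteration budget of q(n) trial evaluations
        if accept(lam):        # placeholder acceptance test (e.g. a monotonicity check)
            return lam
        lam *= 0.5             # backtrack to the next, smaller candidate
    return 0.0                 # no candidate accepted: fall back to a plain Landweber step

if __name__ == "__main__":
    # q is non-increasing and eventually zero, so that sum_i q(i) is finite.
    q = lambda n: max(8 - n, 0)
    # Purely illustrative acceptance test standing in for the real DBTS condition.
    accept = lambda lam: lam <= 0.3
    print([dbts_lambda(n, accept, q) for n in range(10)])
```
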
“…In recent years, there has been increasing evidence that second-order iterative methods exhibit remarkable acceleration properties for stably solving ill-posed problems. The best-known methods are the Nesterov acceleration scheme (Neubauer (2017)), the ν-method (Engl et al., 1996, § 6.3), and the two-point gradient method (Hubmer & Ramlau (2017)). Recently, Zhang & Hofmann (2018) established an initial theory of the second order asymptotical regularization method with a fixed damping parameter for solving general linear ill-posed inverse problems.…”
Citation type: mentioning
confidence: 99%
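
For context, the second order asymptotical regularization referred to in the last sentence evolves the reconstruction by a damped second-order flow. In the linear setting Ax = y it can be written, up to the exact choice of initial conditions, as

```latex
\ddot{x}(t) + \eta\,\dot{x}(t) + A^{*}\bigl(A\,x(t) - y^{\delta}\bigr) = 0,
\qquad x(0) = x_0, \quad \dot{x}(0) = 0,
```

where η > 0 is the fixed damping parameter; suitable time discretizations of this flow lead to Nesterov-type and two-point gradient-type iterations.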