2018
DOI: 10.1088/1361-6420/aacebe
Nesterov’s accelerated gradient method for nonlinear ill-posed problems with a locally convex residual functional

Abstract: In this paper, we consider Nesterov's Accelerated Gradient method for solving Nonlinear Inverse and Ill-Posed Problems. Known to be a fast gradient-based iterative method for solving well-posed convex optimization problems, this method also leads to promising results for ill-posed problems. Here, we provide a convergence analysis for ill-posed problems of this method based on the assumption of a locally convex residual functional. Furthermore, we demonstrate the usefulness of the method on a number of numerica…
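The iteration studied in the abstract can be sketched for a linear model problem Ax = y. This is a minimal illustrative sketch only: the paper treats the nonlinear case, and the test operator, step size ω, momentum parameter α, and discrepancy-principle constant τ below are assumptions for the example, not the authors' exact setup.

```python
import numpy as np

def nesterov_landweber(A, y_delta, delta, omega, alpha=3.0, tau=2.0, max_iter=500):
    """Nesterov-accelerated Landweber iteration (linear sketch):
        z_k     = x_k + (k-1)/(k+alpha-1) * (x_k - x_{k-1})
        x_{k+1} = z_k - omega * A^T (A z_k - y_delta)
    stopped by the discrepancy principle ||A x_k - y_delta|| <= tau * delta."""
    n = A.shape[1]
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for k in range(1, max_iter + 1):
        if np.linalg.norm(A @ x - y_delta) <= tau * delta:
            break  # discrepancy principle: stop before noise is amplified
        z = x + (k - 1) / (k + alpha - 1) * (x - x_prev)
        x_prev, x = x, z - omega * A.T @ (A @ z - y_delta)
    return x

# Mildly ill-posed test problem: a discretised integration operator,
# so that inversion amounts to (unstable) numerical differentiation.
n = 50
A = np.tril(np.ones((n, n))) / n
x_true = np.sin(np.linspace(0, np.pi, n))
rng = np.random.default_rng(0)
delta = 1e-3
y_delta = A @ x_true + delta * rng.standard_normal(n) / np.sqrt(n)
omega = 1.0 / np.linalg.norm(A, 2) ** 2  # step size bounded by 1/||A||^2

x_rec = nesterov_landweber(A, y_delta, delta, omega)
```

Compared with plain Landweber (obtained by setting the momentum factor to zero), the extrapolation step z_k typically reduces the number of iterations needed to reach the discrepancy stopping index substantially, which is the acceleration effect the abstract refers to.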

Cited by 25 publications (23 citation statements)
References 32 publications
“…Landweber iteration is known to be notoriously slow. However, with the ingredients above, also state-of-the-art accelerated versions of Landweber iteration, such as the steepest descent or the minimal error method [41] or Nesterov iteration and versions thereof [21,39] can be implemented in a straightforward manner.…”
Section: Tikhonov Regularisation
Confidence: 99%
“…≤ γ. Thus, the assumptions of Lemma 2.1 are satisfied and we can make use of estimates (20), (21) with f = r + 2κp 2 t, cf. (18).…”
Confidence: 99%
“…For the Nesterov acceleration (1), a detailed analysis has been performed by Neubauer [15] with the result that, assuming a usual source condition (12) and an appropriate a priori stopping rule, the resulting iterative regularization scheme is of optimal order for μ ≤ 1/2, and, for μ > 1/2, the convergence rates improve with μ but in a suboptimal way. More precisely, the convergence rates proven in [15] are…”
Section: Convergence Rates and Semi-saturation
Confidence: 99%
“…In the realm of ill-posed problems, Hubmer and Ramlau [12] performed a convergence analysis for the nonlinear case, and showed the efficiency of the method.…”
Section: Introduction
Confidence: 99%