2017
DOI: 10.1088/1361-6420/33/4/044001
A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors

Abstract: Many problems in science and engineering involve, as part of their solution process, the consideration of a separable function which is the sum of two convex functions, one of them possibly non-smooth. Recently a few works have discussed inexact versions of several accelerated proximal methods aiming at solving this minimization problem. This paper shows that inexact versions of a method of Beck and Teboulle (FISTA) preserve, in a Hilbert space setting, the same (non-asymptotic) rate of convergence under some …
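For orientation, the minimization problem described in the abstract is min f(x) + g(x), with f smooth and g possibly non-smooth, and the exact (error-free) FISTA scheme of Beck and Teboulle proceeds by a proximal-gradient step plus a momentum extrapolation. The following is a minimal illustrative sketch of that exact scheme, not of the paper's inexact/perturbed variants; all names and the lasso-style example are assumptions for illustration only.

```python
import numpy as np

def fista(grad_f, prox_g, L, x0, iters=100):
    """Minimal FISTA sketch for min f(x) + g(x):
    f smooth with L-Lipschitz gradient, g with a computable prox."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        # Proximal-gradient step with step size 1/L.
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Momentum update on t and the extrapolated point y.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Illustrative instance: f(x) = 0.5 * ||x - b||^2, g(x) = lam * ||x||_1,
# whose minimizer is the soft-thresholding of b at level lam.
b = np.array([3.0, 0.5, -2.0])
lam = 1.0
grad_f = lambda y: y - b
prox_g = lambda z, step: np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
x_star = fista(grad_f, prox_g, L=1.0, x0=np.zeros(3))
```

Here the closed-form answer is the soft-threshold of b at level 1, i.e. [2, 0, -1], which the iteration reaches quickly since f is a simple quadratic.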

Cited by 18 publications (19 citation statements)
References 91 publications
“…We believe that TEPROG, as well as the "telescopic" idea (regarding the telescopic sequence), hold a promising potential to be applied in other theoretical and practical scenarios. It will be interesting to test TEPROG numerically and to compare it with other methods, and also to check whether suitable inexact versions of TEPROG, namely ones which allow errors to appear during the iterative process, exhibit similar convergence properties (in this connection, see [37], [38] and the references therein).…”
Section: Discussion
confidence: 99%
“…The state of current research on superiorization can best be appreciated from the "Superiorization and Perturbation Resilience of Algorithms: A Bibliography compiled and continuously updated by Yair Censor" [22]. In addition, [49], [21] and [71,Section 4] are recent reviews of interest.…”
Section: Superiorization
confidence: 99%
“…This methodology is heuristic and its goal is to find certain good, or superior, solutions to optimization problems. More precisely, suppose that we want to solve a certain optimization problem, for example, minimization of a convex function under constraints (below we focus on this optimization problem because it is relevant to our paper; for an approach which considers the superiorization methodology in a much broader form, see [71,Section 4]). Often, solving the full problem can be rather demanding from the computational point of view, but solving part of it, say the feasibility part (namely, finding a point which satisfies all the constraints) is, in many cases, less demanding.…”
Section: Superiorization
confidence: 99%
“…These concepts are rigorously defined in several recent works in the field; we refer the reader to the recent reviews [13], [5] and the references therein. More material about the current state of superiorization can also be found in [6], [14] and [19].…”
Section: Introduction: the General Concept Of Superiorization
confidence: 99%