2022
DOI: 10.48550/arxiv.2201.11643
Preprint

From the Ravine method to the Nesterov method and vice versa: a dynamical system perspective

H. Attouch, J. Fadili

Abstract: We revisit the Ravine method of Gelfand and Tsetlin from a dynamical system perspective, study its convergence properties, and highlight its similarities and differences with the Nesterov accelerated gradient method. The two methods are closely related. They can be deduced from each other by reversing the order of the extrapolation and gradient operations in their definitions. They benefit from similar fast convergence of values and convergence of iterates for general convex objective functions. We will also e…
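The abstract's central observation — that the two methods differ only in the order of the extrapolation and gradient operations — can be sketched numerically. The following is a minimal illustration on a quadratic objective; the fixed momentum coefficient and step size are illustrative choices, not the paper's parameter rules, and the update orderings follow the abstract's description rather than any specific scheme from the paper.

```python
import numpy as np

def nesterov(grad, x0, step, beta, iters):
    # Nesterov ordering: extrapolate first, then take a gradient
    # step at the extrapolated point.
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)        # extrapolation
        x_prev, x = x, y - step * grad(y)  # gradient step
    return x

def ravine(grad, x0, step, beta, iters):
    # Ravine ordering: gradient step first, then extrapolate.
    y_prev = x0.copy()
    y = x0 - step * grad(x0)
    for _ in range(iters):
        x = y + beta * (y - y_prev)        # extrapolation
        y_prev, y = y, x - step * grad(x)  # gradient step
    return y

# Ill-conditioned quadratic test problem: f(x) = 0.5 * ||A x||^2.
A = np.diag([1.0, 10.0])
grad = lambda x: A.T @ (A @ x)
x0 = np.array([1.0, 1.0])
step = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L for this quadratic
xn = nesterov(grad, x0, step, 0.9, 200)
xr = ravine(grad, x0, step, 0.9, 200)
```

Both sequences drive the iterates toward the minimizer at the origin at a comparable rate, consistent with the abstract's claim that the two methods share similar fast convergence properties.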

Cited by 4 publications (9 citation statements)
References 35 publications
“…Let us consider Case 3, when (1) reduces to finding y ∈ R^p such that 0 ∈ Ay + By. By Lemma 2.1, y ∈ zer(A + B) if and only if G_{λQ} y = 0, where Q := A + B and G_{λQ} is defined by (2). Moreover, G_{λQ} is λ(4−λL)…”
Section: Application to Forward-Backward Splitting Methods
Confidence: 97%
“…In fact, (12) covers the proximal-point scheme in [24] as a special case. As discussed in [2], (12) can be viewed as the Ravine method if convergence is given in y_k instead of x_k.…”
Section: The Relation Between Halpern's and Nesterov's Accelerations
Confidence: 99%
“…which was introduced by Su, Boyd and Candès in [48]. (AVD)_α is a low-resolution ODE of the accelerated gradient method of Nesterov [41,42] and of the Ravine method [19], [48]. (AVD)_α has been the subject of many recent studies which have given an in-depth understanding of the Nesterov accelerated gradient method; see [5], [7], [9], [12], [24], [36], [40], [48], [46], [52].…”
Section: The Role of the Tikhonov Regularization
Confidence: 99%
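For context, the (AVD)_α dynamic referred to in this excerpt is the inertial gradient system with asymptotically vanishing damping studied by Su, Boyd and Candès:

```latex
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla f(x(t)) = 0, \qquad \alpha > 0.
```

For a convex objective f and α ≥ 3, its trajectories satisfy f(x(t)) − min f = O(1/t²), mirroring the O(1/k²) rate of Nesterov's scheme, of which this ODE is the low-resolution continuous-time limit.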