2018
DOI: 10.1007/s11075-018-0565-4
Convergence rates of accelerated proximal gradient algorithms under independent noise

Cited by 6 publications (2 citation statements)
References 32 publications
“…Direct use of the forward-backward splitting method on (11) yields the following scheme $X_{k+1} = \mathcal{S}_{\gamma\lambda}\big(X_k - \gamma(Q X_k - Q B + \mu X_k)\big)$, where $\gamma$ is the stepsize. It is well known that forward-backward splitting can be accelerated by the Nesterov technique, which is frequently used in sparse or low-rank signal processing [26,27,28,29,30,31,32]. By introducing an auxiliary positive sequence $(t_k)_{k\ge 0}$ with $t_0=1$ and a sequence $(Y_k)_{k\ge 0}$, the accelerated scheme can be described as: for any given $X_0$, set $Y_0=X_0$, $t_0=1$, …”
Section: Problem Formulation and The Solution (mentioning)
confidence: 99%
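For orientation, the following is a minimal NumPy sketch of the accelerated forward-backward (FISTA-type) iteration described in the excerpt above. The objective it minimizes, a quadratic data term built from $Q$ and $B$ plus a ridge term weighted by $\mu$ and an $\ell_1$ penalty weighted by $\lambda$, is an assumption chosen so that the gradient step reproduces $X_k - \gamma(Q X_k - Q B + \mu X_k)$; the names soft_threshold and accelerated_fbs are illustrative and not taken from the cited papers.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||X||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def accelerated_fbs(Q, B, lam, mu, gamma, n_iter=200):
    """Nesterov-accelerated forward-backward splitting for the assumed model
    min_X 0.5*<X, Q X> - <Q B, X> + 0.5*mu*||X||_F^2 + lam*||X||_1,
    chosen so the gradient step matches X - gamma*(Q X - Q B + mu X)."""
    X = np.zeros_like(B)
    Y = X.copy()
    t = 1.0
    for _ in range(n_iter):
        # Forward (gradient) step on the smooth part, taken at the extrapolated point Y_k.
        grad = Q @ Y - Q @ B + mu * Y
        X_new = soft_threshold(Y - gamma * grad, gamma * lam)
        # Update the auxiliary sequence (t_k) and the extrapolation point Y_k.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)
        X, t = X_new, t_new
    return X

# Illustrative usage with placeholder data (Q should be positive semidefinite and
# gamma at most 1 / (||Q||_2 + mu) for the accelerated scheme to converge).
rng = np.random.default_rng(0)
Q = np.eye(20)
B = rng.standard_normal((20, 20))
X_hat = accelerated_fbs(Q, B, lam=0.1, mu=0.01, gamma=1.0 / (1.0 + 0.01))
```

Because $t_0 = 1$, the first extrapolation weight $(t_0 - 1)/t_1$ is zero, so the initial step coincides with a plain forward-backward step and the momentum only takes effect from the second iteration onward.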
“…where $h$ is the stepsize, prox is the proximal operator and $e_k$ is the noise. In the convex case, this algorithm is discussed in [38,29], and the acceleration is studied in [31].…”
Section: Inexact Nonconvex Proximal Gradient Algorithm (mentioning)
confidence: 99%
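As a rough illustration of such an inexact scheme, here is a short NumPy sketch of a noisy proximal gradient iteration of the form $x_{k+1} = \mathrm{prox}_{hg}\big(x_k - h\,\nabla f(x_k) + e_k\big)$. The placement of the noise inside the prox argument, the Gaussian model for $e_k$, and the decaying noise scale are assumptions made for illustration; the exact error model analyzed in the cited papers is not reproduced here.

```python
import numpy as np

def inexact_proximal_gradient(grad_f, prox_g, x0, h, noise_scale=1e-3,
                              n_iter=200, rng=None):
    """Noisy proximal gradient sketch: x_{k+1} = prox_g(x_k - h*grad_f(x_k) + e_k, h).
    The Gaussian, decaying noise e_k is an illustrative stand-in for the
    independent noise discussed in the excerpt, not the papers' exact model."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for k in range(n_iter):
        e_k = noise_scale / (k + 1) * rng.standard_normal(x.shape)  # assumed noise model
        x = prox_g(x - h * grad_f(x) + e_k, h)
    return x

# Illustrative usage: noisy ISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1 on synthetic data.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = A @ np.ones(10)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, h: np.sign(v) * np.maximum(np.abs(v) - h * lam, 0.0)
x_hat = inexact_proximal_gradient(grad_f, prox_g, np.zeros(10),
                                  h=1.0 / np.linalg.norm(A, 2) ** 2)
```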