2023
DOI: 10.1016/j.cam.2023.115190
The regularized stochastic Nesterov’s accelerated Quasi-Newton method with applications

Cited by 2 publications
(2 citation statements)
References 20 publications
“…Large-scale optimization through sampled versions of the quasi-Newton method was considered in [14,15]. Also, the convergence rates of randomized and greedy variants of Newton-type and quasi-Newton methods were presented in [16-24].…”
Section: Introduction
confidence: 99%
“…An extension of BFGS proposed in [49] generates an estimate of the true objective function by taking the empirical mean over a sample drawn at each step, and it attains R-superlinear convergence. A regularized stochastic accelerated quasi-Newton method (RES-NAQ), which combines the regularized stochastic BFGS method (RES) with Nesterov's acceleration technique by introducing a new momentum coefficient, was proposed in [50].…”
Section: Introduction
confidence: 99%
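The citation statement above outlines the general recipe behind RES-NAQ: take stochastic BFGS curvature pairs, regularize them, and step from a Nesterov look-ahead point. The following is a minimal illustrative sketch of that recipe on a least-squares problem, not the paper's actual algorithm: the fixed momentum coefficient `mu`, the step size, the regularization constant `reg`, and the function names are all assumptions made here for illustration (the paper's contribution is precisely a new momentum coefficient, which is not reproduced).

```python
import numpy as np

# Synthetic consistent least-squares problem: f(x) = (1/2n) * ||A x - b||^2.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star

def grad_fn(x, idx):
    # Stochastic gradient of the least-squares loss on mini-batch `idx`.
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

def res_naq_sketch(x0, lr=0.1, mu=0.7, reg=1e-3, iters=500, batch=20):
    # Hypothetical sketch combining three ingredients named in the text:
    # (1) Nesterov look-ahead, (2) a stochastic BFGS inverse-Hessian update
    # with both gradients taken on the SAME mini-batch, (3) a regularized
    # curvature pair y <- y + reg * s, in the spirit of RES.
    d = x0.size
    H = np.eye(d)              # inverse-Hessian approximation
    x, v = x0.copy(), np.zeros(d)
    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)
        g = grad_fn(x + mu * v, idx)       # gradient at Nesterov look-ahead
        v = mu * v - lr * (H @ g)          # momentum + quasi-Newton direction
        x_new = x + v
        s = x_new - x
        y = grad_fn(x_new, idx) - grad_fn(x, idx) + reg * s
        sy = s @ y
        if sy > 1e-10:                     # curvature condition keeps H SPD
            rho = 1.0 / sy
            I = np.eye(d)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x = x_new
    return x

x_hat = res_naq_sketch(np.zeros(d))
print(np.linalg.norm(x_hat - x_star))
```

Because the linear system is consistent, every mini-batch gradient vanishes at `x_star`, so this toy run converges to the true solution; on noisy objectives the regularization term `reg * s` is what keeps the curvature pairs well conditioned.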