2021
DOI: 10.48550/arxiv.2106.10520
Preprint

SAN: Stochastic Average Newton Algorithm for Minimizing Finite Sums

Abstract: We present a principled approach for designing stochastic Newton methods for solving finite sum optimization problems. Our approach has two steps. First, we rewrite the stationarity conditions as a system of nonlinear equations that associates each data point to a new row. Second, we apply a subsampled Newton-Raphson method to solve this system of nonlinear equations. By design, methods developed using our approach are incremental, in that they require only a single data point per iteration. Using our approach…
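The abstract outlines a two-step recipe: write per-data-point stationarity equations, then solve them with a subsampled Newton-Raphson method that touches one data point per iteration. The sketch below is only a hedged illustration of that general pattern, not the paper's SAN update: it applies a one-data-point-per-iteration Newton-Raphson step to an assumed ridge-regularized logistic-regression finite sum, and the loss, regularizer lam, step count, and function names are illustrative assumptions.

import numpy as np

# Illustrative sketch (not the paper's exact SAN algorithm): an incremental
# Newton-Raphson-style step that uses a single data point per iteration.
# The finite sum assumed here is ridge-regularized logistic regression.

def grad_i(w, xi, yi, lam):
    # Gradient of f_i(w) = log(1 + exp(-yi * xi @ w)) + (lam/2) * ||w||^2
    s = 1.0 / (1.0 + np.exp(yi * xi @ w))
    return -yi * s * xi + lam * w

def hess_i(w, xi, yi, lam):
    # Hessian of f_i: rank-one logistic term plus the regularizer
    s = 1.0 / (1.0 + np.exp(yi * xi @ w))
    return s * (1.0 - s) * np.outer(xi, xi) + lam * np.eye(w.size)

def incremental_newton(X, y, lam=0.1, n_iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.integers(n)                 # subsample one data point
        g = grad_i(w, X[i], y[i], lam)      # its stationarity residual
        H = hess_i(w, X[i], y[i], lam)      # its local curvature
        w -= np.linalg.solve(H, g)          # Newton-Raphson step on that row
    return w

# Toy usage on synthetic data
X = np.random.randn(200, 5)
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * np.random.randn(200))
w_hat = incremental_newton(X, y)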

Cited by 1 publication (1 citation statement)
References 15 publications
“…Practical experiments have shown that SAG requires many parameter fine-tuning to perform perfectly. Some other variants of SAG are optimization of a finite sum of smooth convex functions (Schmidt et al, 2017) and its second-order version named Stochastic Average Newton (SAN) (Chen et al, 2021).…”
Section: Stochastic Average Gradient
confidence: 99%