2020
DOI: 10.1080/10556788.2020.1725751

An investigation of Newton-Sketch and subsampled Newton methods

Cited by 65 publications (59 citation statements). References 12 publications.
Citation statements: 1 supporting, 58 mentioning, 0 contrasting.
“…When the overlap set O_k is not too small, y_k is a useful approximation of the curvature of the objective function along the most recent displacement, and leads to a productive quasi-Newton step. This observation is based on an important property of Newton-like methods, namely that there is much more freedom in choosing a Hessian approximation than in computing the gradient [3,8,16,45]. More specifically, a smaller sample O_k can be employed for updating the inverse Hessian approximation H_k, than for computing the batch gradient g_k^{S_k} used to define the search direction -H_k g_k^{S_k}.…”
Section: A Multi-Batch Quasi-Newton Method (mentioning, confidence: 99%)
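To make the overlap idea quoted above concrete, here is a minimal NumPy sketch, not the cited authors' implementation; the function names (grad, multibatch_step), the memory size, and the parameter values are hypothetical. It forms the curvature pair (s_k, y_k) from gradients evaluated on the small overlap set O_k only, while the search direction -H_k g_k^{S_k} uses the larger batch S_k via the standard L-BFGS two-loop recursion.

import numpy as np

def lbfgs_two_loop(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns H_k g without forming H_k."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if y_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)            # initial scaling H_k^0 = gamma * I
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def multibatch_step(w, grad, batch, overlap, s_list, y_list, lr=1.0, memory=10):
    """One multi-batch quasi-Newton step.

    grad(w, idx) -> gradient of the loss on samples idx (hypothetical signature).
    batch   = S_k : large sample defining the search direction -H_k g_k^{S_k}.
    overlap = O_k : small sample shared with the previous batch, used only
                    to form the curvature pair (s_k, y_k).
    """
    g_batch = grad(w, batch)              # g_k^{S_k}
    d = -lbfgs_two_loop(g_batch, s_list, y_list)
    w_new = w + lr * d
    # Curvature pair from the overlap only: cheaper than using all of S_k.
    s_k = w_new - w
    y_k = grad(w_new, overlap) - grad(w, overlap)
    if s_k @ y_k > 1e-10 * (s_k @ s_k):   # skip update if curvature not positive
        s_list.append(s_k); y_list.append(y_k)
        if len(s_list) > memory:          # limited memory
            s_list.pop(0); y_list.pop(0)
    return w_new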
“…Finally, we would like to mention that in [5] the authors perform a local complexity analysis of subsampled Inexact Newton methods and also show that methods that incorporate stochastic second-order information can be far more efficient on badly-scaled or ill-conditioned problems than first-order methods.…”
Section: Related Work (mentioning, confidence: 99%)
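A minimal sketch of a subsampled inexact Newton step of the kind analysed in the quoted work, written from the generic description rather than from [5]; the helper names full_grad and sub_hess_vec are hypothetical, while scipy's cg is a real routine. The Hessian is estimated on a random subsample and the Newton system is solved only approximately with conjugate gradients.

import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def subsampled_inexact_newton_step(w, full_grad, sub_hess_vec, n_samples,
                                   sample_frac=0.1, forcing_term=1e-1):
    """One subsampled inexact Newton step.

    full_grad(w)            -> gradient over the full data set (hypothetical).
    sub_hess_vec(w, idx, v) -> Hessian-vector product restricted to the
                               subsample idx (hypothetical signature).
    The system H_S d = -g is solved only approximately by CG, stopped
    at the relative residual given by the forcing term.
    """
    g = full_grad(w)
    idx = np.random.choice(n_samples, size=max(1, int(sample_frac * n_samples)),
                           replace=False)
    H = LinearOperator((w.size, w.size),
                       matvec=lambda v: sub_hess_vec(w, idx, v))
    # rtol is the CG keyword in SciPy >= 1.12 (older versions call it tol).
    d, info = cg(H, -g, rtol=forcing_term, maxiter=100)
    return w + d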
“…Our analysis is closely related to that developed in [5,7,36,37], where convergence of Inexact subsampled Newton methods is investigated both in probability [36,37] and in expectation [9,7,5]. We differ from these papers in that we focus on the choice of the forcing terms, on the nonmonotone line search strategy and on adaptive choices of the Hessian sample size.…”
Section: Related Work (mentioning, confidence: 99%)
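To illustrate one of those ingredients, here is a minimal sketch of a nonmonotone Armijo backtracking line search, a generic textbook rule rather than the cited authors' exact strategy; the parameter values are hypothetical. The step is accepted when the trial value improves on the worst of the last M objective values instead of the current one, which tolerates the noise introduced by subsampling.

def nonmonotone_armijo(f, w, d, g, f_history, M=10, c1=1e-4, shrink=0.5,
                       max_backtracks=30):
    """Backtracking line search with the nonmonotone acceptance rule
    f(w + t*d) <= max(last M objective values) + c1 * t * g^T d."""
    f_ref = max(f_history[-M:])   # reference value over the last M iterates
    slope = g @ d                 # directional derivative; negative for descent
    t = 1.0
    for _ in range(max_backtracks):
        if f(w + t * d) <= f_ref + c1 * t * slope:
            break
        t *= shrink
    return t

# Typical use per iteration (f_history starts as [f(w0)]):
#   t = nonmonotone_armijo(f, w, d, g, f_history)
#   w = w + t * d
#   f_history.append(f(w))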
“…A fruitful line of research has focused on how to improve the asymptotic convergence rate as t → ∞ through preconditioning: a technique that involves approximating the unknown Hessian H(θ) = ∇^2_θ L(θ) (see, for instance, Bordes et al. (2009) and references therein). Utilizing the curvature information that is reflected by various efficient approximations of the Hessian matrix, stochastic quasi-Newton methods (Moritz et al., 2016; Byrd et al., 2016; Wang et al., 2017; Schraudolph et al., 2007; Mokhtari and Ribeiro, 2015; Becker and Fadili, 2012), Newton sketching or subsampled Newton methods (Pilanci and Wainwright, 2015; Xu et al., 2016; Berahas et al., 2017; Bollapragada et al., 2016) and stochastic approximation of the inverse Hessian via Taylor series expansion (Agarwal et al., 2017) have been proposed to strike a balance between convergence rate and per-iteration complexity.…”
Section: Relationships to the Literature (mentioning, confidence: 99%)
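Since the surveyed paper concerns Newton-Sketch, a minimal worked example may help: a generic Gaussian-sketch step for least squares, written from the standard description rather than from the paper, with hypothetical names and sizes. The Hessian A^T A, which costs O(n d^2) to form, is replaced by (S A)^T (S A), built from a random sketch with m << n rows.

import numpy as np

def newton_sketch_ls_step(A, b, w, m, ridge=1e-8):
    """One Newton-sketch step for f(w) = 0.5 * ||A w - b||^2.

    A is n x d with n >> d; the exact Hessian A^T A is approximated by
    (S A)^T (S A), where S is an m x n Gaussian sketch with m << n.
    """
    n, d = A.shape
    g = A.T @ (A @ w - b)                     # exact gradient
    S = np.random.randn(m, n) / np.sqrt(m)    # Gaussian sketching matrix
    SA = S @ A                                # sketched data, m x d
    H_sketch = SA.T @ SA + ridge * np.eye(d)  # approximate Hessian (regularized)
    return w + np.linalg.solve(H_sketch, -g)

# Example: n = 10_000 rows sketched down to m = 200.
# rng = np.random.default_rng(0)
# A = rng.standard_normal((10_000, 50)); b = rng.standard_normal(10_000)
# w = np.zeros(50)
# for _ in range(5):
#     w = newton_sketch_ls_step(A, b, w, m=200)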