2022
DOI: 10.1007/s10107-022-01846-z
An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians

Cited by 21 publications (52 citation statements). References 41 publications.
“…Recently, a class of stochastic SQP methods has been developed for solving (1.1). These methods outperform stochastic penalty methods empirically and have convergence guarantees in expectation [7,28]. In [7], the authors propose an objective-function-free stochastic SQP method with adaptive step sizes for the fully stochastic regime.…”
Section: Introduction (mentioning)
confidence: 99%
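
For context on the quoted statements, here is a schematic of the step such stochastic SQP methods compute; the notation is illustrative, not taken from this report. For an equality-constrained problem min_x f(x) subject to c(x) = 0, given an iterate x_k, a stochastic gradient estimate \bar{g}_k ≈ ∇f(x_k), and a symmetric approximation H_k of the Hessian of the Lagrangian, the search direction d_k solves the quadratic program

    min_d  \bar{g}_k^T d + (1/2) d^T H_k d   subject to   c(x_k) + ∇c(x_k)^T d = 0,

and the iterate is updated as x_{k+1} = x_k + α_k d_k, with α_k chosen adaptively in [7] or by a step search in [28].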
“…In [7], the authors propose an objective-function-free stochastic SQP method with adaptive step sizes for the fully stochastic regime. In contrast, in [28], the authors propose a stochastic step search (referred to as line search in the paper [28]) SQP method for the setting in which the errors in the function and derivative approximations can be diminished. We note that several algorithm choices in the two papers [7,28], e.g., merit functions and merit parameters, are different.…”
Section: Introduction (mentioning)
confidence: 99%
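
To make the merit-function distinction in this quote concrete, a hedged illustration using formulas that are standard in this literature rather than quoted from either paper: penalty-type stochastic SQP methods typically monitor a nonsmooth merit function such as

    φ(x; τ) = τ f(x) + ‖c(x)‖,

adapting the merit parameter τ > 0 across iterations, whereas the paper under review works with a differentiable exact augmented Lagrangian, schematically of the form

    L_{μ,ν}(x, λ) = f(x) + λ^T c(x) + (μ/2) ‖c(x)‖² + (ν/2) ‖q(x, λ)‖²,

where q(x, λ) is a measure of the stationarity residual; the precise residual term and weighting in the paper differ from this sketch.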
“…One direction for future work is to apply our Hessian averaging technique to constrained nonlinear optimization problems. [45,46] designed various stochastic second-order methods based on sequential quadratic programming (SQP) for solving constrained problems, where the Hessian of the Lagrangian was estimated by subsampling. These works established global convergence for stochastic SQP methods, while the local convergence rate of these methods remains unknown.…”
Section: Discussion (mentioning)
confidence: 99%
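
A minimal sketch of the subsampling idea this quote describes, assuming per-sample access to Hessians of the Lagrangian; the callback name and interface below are hypothetical, not from [45,46].

    import numpy as np

    def subsampled_lagrangian_hessian(per_sample_hessian, x, lam, sample_indices):
        # per_sample_hessian(i, x, lam) -> (d, d) array: contribution of data
        # point i to the Hessian of the Lagrangian (hypothetical callback).
        d = x.shape[0]
        H = np.zeros((d, d))
        for i in sample_indices:
            H += per_sample_hessian(i, x, lam)
        # Average over the subsample to estimate the full Hessian.
        return H / len(sample_indices)

Drawing sample_indices uniformly at random makes the estimate unbiased for the full Hessian; larger subsamples trade computation for lower variance.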
“…Non-convex minimization problems with non-convex constraints have recently been studied due to their popularity in the machine learning community [16,14]. [57,58] developed stochastic algorithms based on sequential quadratic programming that incorporate a stochastic line search and converge globally to a first-order stationary point of the optimization program. [18] and [1] used a Lagrangian-based approach with access to an optimization oracle and found a distribution over solutions rather than a pure equilibrium.…”
Section: Related Work (mentioning)
confidence: 99%
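
To illustrate the step-search pattern this quote refers to, a simplified sketch of one SQP iteration with backtracking on a merit function; this is an assumption-laden illustration (exact KKT solve, user-supplied merit callbacks), not the algorithm of [57,58].

    import numpy as np

    def sqp_step_with_step_search(x, g, H, c, J, merit, merit_dir_deriv,
                                  alpha0=1.0, eta=1e-4, backtrack=0.5):
        # g: (stochastic) gradient estimate at x; H: Lagrangian Hessian estimate;
        # c: constraint values c(x); J: constraint Jacobian with rows ∇c_i(x)^T.
        # merit(x) and merit_dir_deriv(x, d) are hypothetical user callbacks.
        n, m = x.size, c.size
        # Solve the QP via its KKT system: [H J^T; J 0] [d; lam] = [-g; -c].
        K = np.block([[H, J.T], [J, np.zeros((m, m))]])
        sol = np.linalg.solve(K, np.concatenate([-g, -c]))
        d = sol[:n]
        # Backtrack (step search) until a sufficient-decrease condition holds.
        alpha, phi0, dphi = alpha0, merit(x), merit_dir_deriv(x, d)
        while alpha > 1e-12 and merit(x + alpha * d) > phi0 + eta * alpha * dphi:
            alpha *= backtrack
        return x + alpha * d

In the stochastic setting of [28] and [57,58], the merit function and its directional derivative are themselves estimated, which is what distinguishes a stochastic step search from a classical line search.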