A regularization method for constrained nonlinear least squares
2020
DOI: 10.1007/s10589-020-00201-2

Cited by 6 publications (7 citation statements)
References 43 publications
“…The main advantage of a Julia implementation over our original AMPL implementation is that we may take full advantage of our Julia software suite for optimization, hosted under the JuliaSmoothOptimizers (JSO) organization [22]. Our suite provides a general, consistent API for solvers to interact with models by providing flexible data types to represent the objective and constraint functions, to evaluate their derivatives, to examine bounds on the variables, to add slack variables transparently, and to provide essentially any information that a solver might request from a model.…”
Section: Key Features
Citation type: mentioning
Confidence: 99%
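To make the excerpt above concrete, here is a minimal, hypothetical sketch of the kind of model interaction it describes, assuming the ADNLPModels package (another JSO component, not named in the excerpt) as a concrete backend; the problem data and variable names are illustrative only:

```julia
using ADNLPModels, NLPModels
using NLPModelsModifiers   # provides SlackModel in current JSO versions

f(x) = (x[1] - 1)^2 + (x[2] - 2)^2        # objective (illustrative data)
c(x) = [x[1]^2 + x[2]^2]                  # one inequality constraint
nlp = ADNLPModel(f, [0.5; 0.5],           # starting point x0
                 [0.0; 0.0], [2.0; 2.0],  # bounds 0 <= x <= 2
                 c, [-Inf], [1.0])        # -Inf <= c(x) <= 1

cons(nlp, nlp.meta.x0)   # evaluate the constraints at x0
snlp = SlackModel(nlp)   # add a slack variable transparently
snlp.meta.nvar           # 3: the two x's plus one slack
```

SlackModel wraps the original model so that a solver written for equality-constrained problems can consume it without the model's author changing anything, which is the "transparent" slack handling the excerpt refers to.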
“…• NLPModels [24] is the main modeling package that defines the API on which solvers can rely to interact with models. Models are represented as instances of a data type deriving from the base type AbstractNLPModel, and solvers can evaluate the objective value by calling the obj() method, the gradient vector by calling the grad() method, and so forth.…”
Section: Key Features
Citation type: mentioning
Confidence: 99%
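The two methods named in this excerpt, obj() and grad(), are easy to demonstrate. A minimal sketch, again assuming ADNLPModels to supply a concrete AbstractNLPModel subtype (the Rosenbrock problem below is illustrative, not from the source):

```julia
using ADNLPModels, NLPModels

# Any subtype of AbstractNLPModel answers the same evaluation API.
nlp = ADNLPModel(x -> (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])

x  = nlp.meta.x0
fx = obj(nlp, x)    # objective value f(x)
gx = grad(nlp, x)   # gradient vector of f at x
```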
“…In particular, the work of Izmailov et al. [15] is concerned with local convergence properties of Tikhonov-type regularization algorithms, which requires the use of second derivatives. Meanwhile, other regularization techniques for constrained least squares that do not directly belong to the Levenberg-Marquardt class of methods have recently been proposed that are also based on the SQP methodology (see [17,22] and references therein). However, to the best of our knowledge, these approaches are not based on nonmonotone rules, and generally require second-order derivatives.…”
Citation type: mentioning
Confidence: 99%
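For context, the Levenberg-Marquardt class referred to above replaces the Gauss-Newton subproblem with a regularized one that needs only first derivatives; a standard textbook form (not quoted from the paper) is:

```latex
% LM step for \min_x \tfrac{1}{2}\|F(x)\|^2, using only the Jacobian J:
\min_{d \in \mathbb{R}^n} \; \tfrac{1}{2}\,\bigl\|F(x_k) + J(x_k)\,d\bigr\|^2
  \;+\; \tfrac{\lambda_k}{2}\,\|d\|^2
```

Tikhonov-type and SQP-based variants instead regularize a model involving the Hessian of the Lagrangian, which is why the excerpt singles out their need for second-order derivatives.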