2015
DOI: 10.1090/mcom/3025

Inexact Restoration approach for minimization with inexact evaluation of the objective function

Abstract: A new method is introduced for minimizing a function that can be computed only inexactly, with different levels of accuracy. The challenge is to evaluate the (potentially very expensive) objective function with low accuracy as long as this does not interfere with the goal of obtaining a high-accuracy minimization in the end. To achieve this goal, the problem is reformulated in terms of constrained optimization and handled with an Inexact Restoration technique. Convergence is proved and numerical experiments motiv…
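As a rough sketch of the kind of reformulation the abstract alludes to (the variable names and the exact constraint below are illustrative assumptions, not taken from the paper), one may treat the evaluation accuracy as an explicit variable and require that, at the solution, the evaluation be exact:

\[
\min_{x,\;y}\; f(x, y) \quad \text{subject to} \quad y = 0,
\]

where \(f(x, y)\) stands for the objective evaluated with an error bounded by the accuracy level \(y \ge 0\). On this sketch, the restoration phase of an Inexact Restoration iteration would reduce \(y\) (re-evaluate the function more accurately), while the optimization phase would decrease the objective using the current, still inexact, value.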

Cited by 22 publications (26 citation statements). References 31 publications.
“…By (31), (35), and (40) we have that f, h, and g are bounded above by C_f, C_h, and C_g, respectively. Then, as ρ + 1 = 1/θ, we obtain that…”
Section: Convergence To Feasibility and Convergence Of Stepsize (mentioning)
confidence: 99%
“…The idea of using the IR framework to deal with optimization problems in which the objective function is subject to evaluation errors comes from [40], where inexactness came from the fact that the evaluation was the result of an iterative process. Evaluating the function with additional precision was considered in [40] as a sort of inexact restoration. This basic principle was developed in [11] and [12], where complexity results were also proved.…”
Section: Introduction (mentioning)
confidence: 99%
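A minimal sketch of the principle this excerpt describes, in which re-evaluating the function with additional precision plays the role of the restoration phase; every name, the accuracy schedule, and the acceptance test below are assumptions made for illustration, not the algorithm of [40] or of the papers citing it:

import numpy as np

def inexact_f_grad(x, tol, rng):
    # Toy objective 0.5*||x||^2 and its gradient, perturbed by simulated
    # evaluation errors of size at most tol.
    f = 0.5 * float(np.dot(x, x))
    g = x + rng.uniform(-tol, tol, size=x.shape)
    return f + rng.uniform(-tol, tol), g

def ir_like_minimize(x0, tol0=1e-1, theta=0.5, step=0.2, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    x, tol = np.asarray(x0, dtype=float), tol0
    for _ in range(n_iters):
        # "Restoration" phase: only refine the evaluation accuracy.
        tol *= theta
        f_x, g_x = inexact_f_grad(x, tol, rng)
        # "Optimization" phase: descent step based on the inexact evaluation.
        x_trial = x - step * g_x
        f_trial, _ = inexact_f_grad(x_trial, tol, rng)
        # Accept only if the inexact values indicate sufficient decrease,
        # with a margin that accounts for the current evaluation error.
        if f_trial <= f_x - 1e-4 * step * float(np.dot(g_x, g_x)) + 2.0 * tol:
            x = x_trial
    return x, tol

x_final, tol_final = ir_like_minimize(np.array([3.0, -2.0]))
print(x_final, tol_final)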
“…This motivates the derivation of methods that approximate the function and/or the gradient and even the Hessian through a subsampling. This topic has been widely studied recently, see for example [6-8, 14, 20, 21, 23]. In these papers the data-fitting problem involves a sum, over a large number of measurements, of the misfits.…”
Section: Introduction (mentioning)
confidence: 99%
“…In these papers the data-fitting problem involves a sum, over a large number of measurements, of the misfits. In [7,8,20,23] exact and inexact Newton methods and line-search methods based on approximations of the gradient and the Hessian obtained through subsampling are considered; in [21] the problem is reformulated in terms of constrained optimization and handled with an Inexact Restoration technique. In [14] the stochastic gradient method is applied to the approximated problems and conditions on the size of the subproblems are given to maintain the rate of convergence of the full gradient method.…”
Section: Introduction (mentioning)
confidence: 99%
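As an illustration of the subsampling idea these excerpts describe (a sketch under assumed names; the least-squares misfit and the fixed batch size are illustrative choices, not taken from the cited papers), the gradient of a sum of misfits over many measurements can be approximated on a random subsample:

import numpy as np

def subsampled_gradient(w, X, y, batch_size, rng):
    # Gradient of the misfit F(w) = (1/(2m)) * sum_i (x_i . w - y_i)^2,
    # estimated from a random subsample of the m measurements.
    idx = rng.choice(X.shape[0], size=batch_size, replace=False)
    residual = X[idx] @ w - y[idx]
    return X[idx].T @ residual / batch_size

rng = np.random.default_rng(0)
X = rng.standard_normal((10000, 5))            # many measurements, few parameters
w_true = np.arange(1.0, 6.0)
y = X @ w_true + 0.01 * rng.standard_normal(10000)

w = np.zeros(5)
for _ in range(200):
    g = subsampled_gradient(w, X, y, batch_size=100, rng=rng)
    w -= 0.1 * g                               # plain subsampled-gradient step
print(w)                                       # approaches w_true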