2011
DOI: 10.1007/s11081-011-9178-7
Gradient-only approaches to avoid spurious local minima in unconstrained optimization

Abstract: We reflect on some theoretical aspects of gradient-only optimization for the unconstrained optimization of objective functions containing nonphysical step or jump discontinuities. This kind of discontinuity arises when the optimization problem is based on the solutions of systems of partial differential equations, in combination with variable discretization techniques (e.g. remeshing in spatial domains, and/or variable time stepping in temporal domains). These discontinuities, which may cause local minima, a…

Cited by 10 publications (25 citation statements)
References 18 publications
“…However, the associated derivative [18] is defined everywhere and it is given by the left hand limit for lower semicontinuous functions and by the right hand limit for upper semi-continuous functions at the discontinuity. The benefit of the complex-step derivative is that it computes the associated derivative and allows the computation of sensitivity information even at a discontinuity.…”
Section: Sensitivities of Discontinuous Functions
confidence: 99%
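The complex-step property quoted above can be illustrated with a small sketch (not taken from the paper; the function `f` and step height are hypothetical). Because the branch test uses only the real part of the argument, the imaginary perturbation never changes which smooth piece is evaluated, so the complex-step formula returns the associated derivative of the underlying smooth trend even exactly at the discontinuity:

```python
def f(x):
    # Smooth quadratic with a nonphysical jump of height 5 at x = 1.
    # Branching on the real part means a complex step "sees" only the
    # smooth piece on the side selected by the real part of x.
    if x.real < 1.0:
        return x * x
    return x * x + 5.0

def complex_step_derivative(func, x, h=1e-20):
    # f'(x) ~= Im(f(x + i*h)) / h -- no subtractive cancellation,
    # so h can be taken extremely small without round-off error.
    return func(x + 1j * h).imag / h

# At the discontinuity x = 1 the result is the derivative of the smooth
# branch, 2*x = 2.0, independent of the jump height.
print(complex_step_derivative(f, 1.0))
```

For the quadratic branch the complex step is exact: Im((x + ih)^2)/h = 2x with no truncation error, which is why the jump height never enters the computed sensitivity.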
“…Similarly, derivative information can also always be computed when using (semi)-analytical or automatic differentiation strategies. The computed sensitivities are the associated derivative [18] that can be used for gradient-only optimization.…”
Section: Sensitivities of Discontinuous Functions
confidence: 99%
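How such associated derivatives feed into gradient-only optimization can be sketched with a simple sign-based bisection; this is an illustrative toy, not the paper's algorithm, and `associated_derivative` is a hypothetical stand-in for sensitivities computed analytically or by automatic differentiation. The search locates a sign change of the derivative while never consulting function values, so spurious local minima created by step discontinuities cannot trap it:

```python
def associated_derivative(x):
    # Associated derivative of an objective whose smooth trend is x**2,
    # regardless of any superimposed step discontinuities.
    return 2.0 * x

def gradient_only_bisection(dfunc, lo, hi, tol=1e-10):
    # Bisect on the *sign* of the derivative only: function values,
    # and hence any nonphysical jumps in them, are never examined.
    assert dfunc(lo) < 0.0 < dfunc(hi), "need a derivative sign change"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if dfunc(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Converges to the derivative sign change at x = 0.
print(gradient_only_bisection(associated_derivative, -2.0, 3.0))
```

The design point is that the termination and update rules depend only on derivative signs, which is precisely what makes associated derivatives (however computed) sufficient information for this class of method.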