2013
DOI: 10.1007/s10107-013-0720-6

The divergence of the BFGS and Gauss Newton methods

Abstract: We present examples of divergence for the BFGS and Gauss Newton methods. These examples have objective functions with bounded level sets and other properties in common with the examples published recently in this journal, such as unit steps and convexity along the search lines. As with these other examples, the iterates, function values and gradients in the new examples fit into the general formulation in our previous work On the divergence of line search methods, Comput. Appl. Math. 26(1) (2007), which also presen…

Cited by 26 publications (17 citation statements)
References 19 publications
“…Moore-Penrose Quasi-Newton methods attempt to build an approximation of the Hessian matrix (or its inverse) that incorporates second-order information by accumulating first-order information as the optimisation proceeds. The Broyden-Fletcher-Goldfarb-Shanno algorithm (BFGS) [49,126,136,144] is one of the most famous quasi-Newton algorithms for unconstrained optimization. Moving away from deterministic algorithms, intelligence-oriented algorithms (genetic algorithms, swarm algorithms) [97,173,179,180], with their simplicity, are another way to search for the solution of extremal problems with many local minima.…”
Section: Least Squares Methods (LSM)
confidence: 99%
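To make the quoted description concrete, here is a minimal NumPy sketch of the BFGS inverse-Hessian update; the function name and the toy quadratic driver are illustrative, not taken from any of the cited references:

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H, given the
    step s = x_new - x_old and the gradient change y = g_new - g_old.
    H stays positive definite only while the curvature condition
    y @ s > 0 holds."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Toy driver with unit steps (the regime the divergence examples study):
# minimize f(x) = 0.5 * x.T @ A @ x on a 2-D convex quadratic.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
x, H = np.array([1.0, 1.0]), np.eye(2)
for _ in range(20):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    s = -H @ g                 # quasi-Newton direction, unit step length
    y = grad(x + s) - g
    H = bfgs_update(H, s, y)
    x = x + s
```

On a convex quadratic the curvature condition y @ s = s.T @ A @ s > 0 always holds; the divergence examples of the indexed paper show that with unit steps on carefully constructed non-quadratic objectives the iteration can fail to converge.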
“…It automatically selects the most efficient algorithm for the given mathematical problem. Matlab uses several algorithms depending on the type of problem to be solved: the interior reflective Newton method [17][18][49,126,136,144], trust-region-dogleg, trust-region-reflective, Levenberg-Marquardt, simplex, BFGS, MiniMax, and so on.…”
Section: Least Squares Methods (LSM)
confidence: 99%
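As a rough analogue of that solver menu, here is a sketch using SciPy rather than the Matlab toolbox the quote describes; the toy objectives are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize, least_squares

# Smooth unconstrained objective: a quasi-Newton (BFGS) solver.
rosen = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
res_qn = minimize(rosen, x0=np.array([-1.2, 1.0]), method="BFGS")

# Nonlinear least squares: trust-region-reflective or Levenberg-Marquardt,
# selected per problem, much like Matlab's lsqnonlin options.
residual = lambda x: np.array([x[0] - 1.0, 10.0 * (x[1] - x[0]**2)])
res_trf = least_squares(residual, x0=[-1.2, 1.0], method="trf")
res_lm  = least_squares(residual, x0=[-1.2, 1.0], method="lm")
```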
“…When coupled to our PnL algorithm, these two optimizations are referred to as VPnL GN. Because of the non-linear nature of the functions f and g, the Gauss-Newton algorithm does not guarantee convergence to the global minimum with a random initialization [20]. However, convergence to optimal parameters can be achieved with an initialization close to the global minimum.…”
Section: Gauss-Newton Optimization for the Rotation and the Translation
confidence: 99%
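A minimal sketch of the plain Gauss-Newton iteration the quote refers to (generic, not the VPnL GN implementation; all names and the toy fitting problem are illustrative). With no line search or trust region, behaviour depends entirely on the initial point, which is why a good initialization matters:

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=50, tol=1e-10):
    """Plain Gauss-Newton for min 0.5 * ||r(x)||^2: each step solves the
    linearized least-squares problem J(x) dx ≈ -r(x). There is no
    globalization, so convergence hinges on the starting point x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r, J = residual(x), jacobian(x)
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy usage: fit y = exp(a * t) to synthetic data. Starting near the
# true parameter converges quickly; a poor start can stall or diverge.
t = np.linspace(0.0, 1.0, 20)
y_obs = np.exp(0.7 * t)
res = lambda a: np.exp(a[0] * t) - y_obs
jac = lambda a: (t * np.exp(a[0] * t)).reshape(-1, 1)
a_hat = gauss_newton(res, jac, x0=[0.0])   # converges to ≈ 0.7
```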
“…Then the optimal solution x_a^*, which naturally coincides with the actual location x_a, can be achieved by using the iterative Gauss-Newton method [26] with an arbitrary initial point…”
Section: Problem 1 (TX Location Optimization)
confidence: 99%
“…where (N_r ≥ 4) to detect the TV-SV system clock difference, according to the solution of (23) in [26].…”
Section: Problem 1 (TX Location Optimization)
confidence: 99%