2020
DOI: 10.21123/bsj.2020.17.3(suppl.).0994

Modified BFGS Update (H-Version) Based on the Determinant Property of Inverse of Hessian Matrix for Unconstrained Optimization

Abstract: The study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function). The modification updates the vector s (the difference between the next solution and the current solution) so that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Moreover, the sequence of inverse Hessian matrices …

Cited by 11 publications (6 citation statements) · References 7 publications
“…A new family of modified BFGS updates was proposed in [9] to solve the unconstrained optimization problem for non-convex functions, based on a new modified weak Wolfe-Powell line search technique. In [10], the authors proposed the modified BFGS update (H-version) by updating the vector s (next solution minus current solution), and they proved that the proposed method preserves the strong positive definite property and is globally convergent. Newton's method is efficient because it uses the Hessian matrix, which provides useful curvature information; however, computing the Hessian matrix can be very costly, or the Hessian may be difficult to evaluate or analytically unavailable. This motivates the class of quasi-Newton techniques, which use only the values of the objective function and its gradients and are closely related to Newton's method.…”
Section: Mahmood and Eid (mentioning)
confidence: 99%
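The excerpt above refers to the H-version (inverse-Hessian) form of the BFGS update. As a hedged illustration, the sketch below implements the *standard* textbook BFGS H-update, not the determinant-preserving modification of [10], whose exact update of the vector s is not reproduced in this report; the quadratic test function and all variable names are illustrative assumptions:

```python
import numpy as np

def bfgs_h_update(H, s, y):
    """Standard BFGS inverse-Hessian (H-version) update.

    H -- current inverse-Hessian approximation (symmetric positive definite)
    s -- step vector, x_{k+1} - x_k
    y -- gradient difference, g_{k+1} - g_k
    Requires the curvature condition s^T y > 0.
    """
    rho = 1.0 / (y @ s)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative example on the quadratic f(x) = 0.5 x^T A x (gradient A x).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.2])
s, y = x1 - x0, A @ x1 - A @ x0
H1 = bfgs_h_update(np.eye(2), s, y)
print(np.allclose(H1 @ y, s))  # secant condition H_{k+1} y_k = s_k holds
```

The secant condition and the preservation of positive definiteness under s^T y > 0 are the standard properties that the cited modification builds on.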
“…(1) The objective function f is assumed to be twice continuously differentiable on a convex open set D; moreover, f is uniformly convex. The most efficient quasi-Newton method is the BFGS method, which was introduced independently by Broyden, Fletcher, Goldfarb, and Shanno [10]. Note that the Hessian approximation B_{k+1} can be modified in the BFGS approach and …”
Section: Mahmood and Eid (mentioning)
confidence: 99%
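For reference, the classical (unmodified) BFGS pair that the excerpt alludes to, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, is the standard textbook formula:

```latex
B_{k+1} = B_k - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k}
              + \frac{y_k y_k^{T}}{y_k^{T} s_k},
\qquad
H_{k+1} = \bigl(I - \rho_k s_k y_k^{T}\bigr) H_k \bigl(I - \rho_k y_k s_k^{T}\bigr)
          + \rho_k s_k s_k^{T},
\qquad
\rho_k = \frac{1}{y_k^{T} s_k}.
```

Here B_k approximates the Hessian and H_k = B_k^{-1} approximates its inverse; the cited works modify this pair rather than replace it.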
“…and in [6] the modified quasi-Newton update for the Modified Broyden-Fletcher-Goldfarb-Shanno method is introduced such that: …”
Section: Modified Broyden Class (mentioning)
confidence: 99%
“…Since the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update is positive definite if and only if s_k^T y_k > 0, [6] introduced the Modified Broyden-Fletcher-Goldfarb-Shanno (MBFGS) update, which is positive definite without conditions; the H-version of this method from [6] is: …”
Section: Introduction (mentioning)
confidence: 99%
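The positive-definiteness claim above rests on the curvature condition s_k^T y_k > 0. A minimal numerical check is sketched below; it uses the standard BFGS H-update rather than the unconditional MBFGS of [6], and the tolerance and random vectors are illustrative assumptions:

```python
import numpy as np

def curvature_ok(s, y, eps=1e-10):
    """Curvature condition s^T y > 0, under which the standard BFGS
    update is guaranteed to remain positive definite."""
    return s @ y > eps * np.linalg.norm(s) * np.linalg.norm(y)

rng = np.random.default_rng(0)
s = rng.standard_normal(3)
y = s + 0.1 * rng.standard_normal(3)  # direction close to s, so s^T y > 0

# With the condition satisfied, the updated H stays symmetric PD.
rho = 1.0 / (y @ s)
V = np.eye(3) - rho * np.outer(s, y)
H1 = V @ V.T + rho * np.outer(s, s)   # standard H-update applied to H = I
print(curvature_ok(s, y), np.all(np.linalg.eigvalsh(H1) > 0))
```

When the curvature condition fails, the standard update can lose positive definiteness, which is precisely the case the MBFGS modification is designed to avoid.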
“…[6] modified the Powell Symmetric Broyden (MPSB) update to guarantee the positive definite (PD) property of the Hessian matrix, which the original update does not guarantee. [7] proposed the MBFGS update by updating the vector s (next solution minus current solution) and proved that it preserves the strong PD property and is globally convergent. This article introduces a new methodology for solving the unconstrained optimization problem by combining the Marquardt method and the MQ-N method.…”
Section: Introduction (mentioning)
confidence: 99%