2021
DOI: 10.1155/2021/8342536
The Global Convergence of a Modified BFGS Method under Inexact Line Search for Nonconvex Functions

Abstract: Among quasi-Newton algorithms, the BFGS method is one of the most frequently studied. However, under inexact Wolfe line searches, or even an exact line search, the global convergence of the BFGS method for nonconvex functions has still not been proven. Motivated by this issue, we propose a new quasi-Newton algorithm with better convergence properties; it is designed according to the following essentials: (1) a modified BFGS formula is designed to guarantee that …
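For context, the sketch below shows the classical BFGS iteration combined with an inexact Wolfe line search, the setting the abstract refers to. It is not the authors' modified BFGS formula (that formula is truncated in the excerpt above); the SciPy-based line search, the curvature-condition safeguard, the step-size fallback, and the nonconvex Rosenbrock test function are illustrative assumptions.

```python
# Minimal sketch of classical BFGS with an inexact Wolfe line search.
# Not the paper's modified update; all parameter choices are illustrative.
import numpy as np
from scipy.optimize import line_search

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    n = x0.size
    H = np.eye(n)                      # inverse-Hessian approximation H_k
    x, g = x0.copy(), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        # Inexact line search satisfying the Wolfe conditions
        alpha = line_search(f, grad, x, p, gfk=g)[0]
        if alpha is None:              # line search failed; take a small fallback step
            alpha = 1e-3
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-10:                 # curvature condition; skip the update otherwise
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS inverse update
        x, g = x_new, g_new
    return x

# Example: minimize the (nonconvex) Rosenbrock function
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([-2*(1 - z[0]) - 400*z[0]*(z[1] - z[0]**2),
                                 200*(z[1] - z[0]**2)])
print(bfgs(rosen, rosen_grad, np.array([-1.2, 1.0])))
```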

Citations: Cited by 2 publications (3 citation statements)
References: 41 publications

Citation statements (ordered by relevance):
“…Therefore, a modification term f(x1, x2, Mc), related to the component mass ratios and cross-linking density of the PDMS network, is introduced on the logarithmic mixing rule to account for the influence of forced intertwining and incompatibility of components. The BFGS quasi-Newton algorithm is adopted to determine the function f; it is considered to be the most numerically effective quasi-Newton method, with global convergence and superlinear convergence speed. The algorithm avoids the calculation of a complex Hessian matrix in the Newton method, and it is widely used in function optimization of various application scenarios. The specific calculation steps of the BFGS quasi-Newton algorithm can be referred to in the Supporting Information (Figure S2). Due to the limited number of available experimental data, we only use the first 21 data points in Table S1 in the Supporting Information to train the appropriate model by the BFGS quasi-Newton algorithm, and reserve the last 12 data points for a completely independent test of the predictive capability of the revised model.…”
Section: Results
confidence: 99%
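The citing study determines the correction term f(x1, x2, Mc) numerically with the BFGS quasi-Newton algorithm, fitting on the first 21 data points and holding out the last 12 for testing. The sketch below shows what such a workflow could look like using SciPy's BFGS implementation on a least-squares objective; the functional form of f, the column layout, and the data are placeholder assumptions, not the citing paper's actual model or measurements.

```python
# Hedged sketch: fit a parametric correction term by minimizing a
# least-squares loss with SciPy's BFGS, mirroring the described workflow
# (train on the first 21 samples, hold out the last 12 for testing).
# The model form and data below are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.random((33, 3))               # columns stand in for x1, x2, Mc
y = 1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] * X[:, 2] + 0.01 * rng.standard_normal(33)

X_train, y_train = X[:21], y[:21]     # first 21 points for fitting
X_test,  y_test  = X[21:], y[21:]     # last 12 points held out for testing

def f_corr(theta, X):
    # Assumed low-order correction term f(x1, x2, Mc); the real model may differ.
    a, b, c = theta
    return a + b * X[:, 0] + c * X[:, 1] * X[:, 2]

def loss(theta, X, y):
    return np.sum((f_corr(theta, X) - y) ** 2)

res = minimize(loss, x0=np.zeros(3), args=(X_train, y_train), method="BFGS")
rmse = np.sqrt(np.mean((f_corr(res.x, X_test) - y_test) ** 2))
print("fitted parameters:", res.x, "held-out RMSE:", rmse)
```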
“…The algorithm avoids the calculation of a complex Hessian matrix in the Newton method, and it is widely used in function optimization of various application scenarios. 42,43 The specific calculation steps of the BFGS quasi-Newton algorithm can be referred to in the Supporting Information (Figure S2). Due to the limited number of available experimental data, we only use the first 21 data points in Table S1 in the Supporting Information to train the appropriate model by the BFGS quasi-Newton algorithm, and reserve the last 12 data points for a completely independent test of the predictive capability of the revised model.…”
Section: Density: The Effect of Different Molar Ratios of [NCO]/[OH] o...
confidence: 99%
“…Moreover, the modified BFGS algorithm performs better than BFGS for some of the test problems (Pengyuan Li, Junyu Lu, and Haishan Feng (2021)). MBFGS has many applications in machine learning, deep learning, artificial intelligence (AI), and data science.…”
Section: Introduction
confidence: 97%