2019
DOI: 10.3934/jimo.2018122
Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization

Abstract: Memoryless quasi-Newton methods are studied for solving large-scale unconstrained optimization problems. Recently, memoryless quasi-Newton methods based on several kinds of updating formulas were proposed. Since these methods are closely related to the conjugate gradient method, they are promising. In this paper, we propose a memoryless quasi-Newton method based on the Broyden family with the spectral-scaling secant condition. We focus on the convex and preconvex classes of the Broyden family, and we show that…

Cited by 10 publications (9 citation statements) | References 28 publications
“…In this paper, we only dealt with the memoryless BFGS formula for scaled proximal mappings. However, following previous research [21,20], we expect that the proposed algorithm based on the memoryless Broyden family with the preconvex class ((25) with φ_k < 0) may perform better than that with the memoryless BFGS formula. Indeed, (25) can be rewritten as…”
Section: Discussion
confidence: 71%
“…Becker et al. [5] gave a theorem similar to Theorem 2.2 that can be applied to the memoryless BFGS formula. However, L becomes more complicated than (23) (in this case, L is a two-dimensional function), and the exact solution of L(α) = 0 cannot be computed even if h(x) = λ∥x∥_1. In addition, in this case, L is not necessarily strongly monotone and its merit functions are possibly nondifferentiable.…”
Section: Algorithm 1 (Proximal Memoryless SR1 Method)
confidence: 99%
“…Memoryless quasi-Newton methods are especially efficient for solving large-scale problems because they have low memory requirements and use matrix-free computation. For that reason, there has been significant development of these methods (see [19,20,22,23] and references therein). Since the SR1 method is an efficient method, some researchers have extended it to memoryless SR1 methods [19,22].…”
confidence: 99%
“…Shanno [27] proposed a memoryless quasi-Newton method as a way to deal with this problem. This method [9,12–15] has proven effective at solving large-scale unconstrained optimization problems. The concept is simple: the approximate matrix is updated by using the identity matrix instead of the previous approximate matrix.…”
Section: Introduction
confidence: 99%
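The identity-based update quoted above can be made concrete with a short matrix-free sketch. The snippet below (a minimal illustration, not code from the cited paper; the function name and variable names are our own) applies the standard BFGS update of the identity matrix, using only the most recent curvature pair (s, y), to compute a search direction with O(n) memory:

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Search direction d = -H g, where H is the BFGS update of the
    identity matrix from the most recent pair s = x_{k} - x_{k-1},
    y = grad_k - grad_{k-1}:
        H = (I - rho*s*y^T)(I - rho*y*s^T) + rho*s*s^T,
        rho = 1 / (y^T s).
    H is never formed explicitly; only vector operations are used,
    which is what makes the method 'memoryless' and matrix-free."""
    rho = 1.0 / np.dot(y, s)          # requires the curvature condition y^T s > 0
    a = np.dot(s, g)
    v = g - rho * a * y               # v = (I - rho*y*s^T) g
    Hg = v - rho * np.dot(y, v) * s + rho * a * s
    return -Hg
```

Expanding the product above term by term reproduces exactly H g, so the routine matches the explicit two-rank update while storing only three n-vectors.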
“…In addition, he showed that the search direction is a sufficient descent direction and has the global convergence property. Nakayama, Narushima, and Yabe [15] proposed memoryless quasi-Newton methods based on the spectral-scaling Broyden family [3]. Their methods generate a sufficient descent direction and have the global convergence property.…”
Section: Introduction
confidence: 99%