2020
DOI: 10.1080/10556788.2020.1727476
Gradient methods exploiting spectral properties

Abstract: We propose a new stepsize for the gradient method. It is shown that this new stepsize converges to the reciprocal of the largest eigenvalue of the Hessian when Dai and Yang's asymptotically optimal gradient method (Computational Optimization and Applications, 2006, 33(1): 73-88) is applied to minimize quadratic objective functions. Based on this spectral property, we develop a monotone gradient method that takes a certain number of steps using the asymptotically optimal stepsize by Dai and Yang, and then foll…
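A minimal sketch of the setting the abstract describes: the gradient method on a strictly convex quadratic, driven by the Dai-Yang stepsize. The formula α_k = ‖g_k‖/‖Ag_k‖ is recalled from Dai and Yang (2006) and should be treated as an assumption, as should the diagonal Hessian, eigenvalue range, and iteration budget; the paper's new stepsize itself is not reproduced here.

```python
# Gradient method on f(x) = 0.5 x^T A x - b^T x with the (assumed) Dai-Yang
# stepsize alpha_k = ||g_k|| / ||A g_k||, which is asymptotically optimal in
# the sense that alpha_k -> 2 / (lam_min + lam_max).
import numpy as np

rng = np.random.default_rng(0)
n = 100
lam = np.linspace(1.0, 100.0, n)   # assumed spectrum of the Hessian
A = np.diag(lam)                   # diagonal Hessian, standard in such tests
b = rng.standard_normal(n)

x = np.zeros(n)
for k in range(1000):
    g = A @ x - b                  # gradient at the current iterate
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = np.linalg.norm(g) / np.linalg.norm(A @ g)
    x = x - alpha * g

# The abstract's point: on top of these iterates one can build a stepsize
# that converges to 1 / lam_max instead (not reproduced here).
print(alpha, 2.0 / (lam[0] + lam[-1]))   # the two values should be close
```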

Cited by 18 publications (13 citation statements). References 40 publications (123 reference statements).
“…However, these methods are based on the SD method, i.e., they occasionally apply short steps during the iterates of the SD method. One exception is given by [21], where a method is developed by employing new stepsizes during the iterates of the AOPT method. But our method periodically uses three different stepsizes: the BB stepsize, stepsize (11), and the new stepsize α_k.…”
Section: Remark (mentioning)
confidence: 99%
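A rough skeleton of the cyclic scheme this excerpt describes: a gradient method that cycles through a list of stepsize rules. The BB1 rule is standard; the excerpt's stepsize (11) and new stepsize are not reproduced in this report, so `short_step` below is a purely hypothetical placeholder, and the cycle length is likewise an assumption.

```python
# Gradient method that cycles through several stepsize rules, in the spirit
# of the excerpt above. Each rule maps the secant pair (s, y) to a stepsize.
import numpy as np

def cyclic_gradient(A, b, x, rules, steps_per_rule=4, iters=300, tol=1e-8):
    """Minimize f(x) = 0.5 x^T A x - b^T x, switching rules periodically."""
    x_old = g_old = None
    for k in range(iters):
        g = A @ x - b
        if np.linalg.norm(g) < tol:
            break
        if g_old is None:
            alpha = 1.0 / np.linalg.norm(g)           # crude first step
        else:
            rule = rules[(k // steps_per_rule) % len(rules)]
            alpha = rule(x - x_old, g - g_old)
        x_old, g_old = x, g
        x = x - alpha * g
    return x

bb1 = lambda s, y: (s @ s) / (s @ y)                # standard BB1 stepsize
short_step = lambda s, y: 0.5 * (s @ s) / (s @ y)   # hypothetical stand-in
x = cyclic_gradient(np.diag([1.0, 10.0, 100.0]), np.ones(3), np.zeros(3),
                    [bb1, short_step])
```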
“…We have tested five different sets of problems in the form of (92), with spectrum distributions described in Table 2; see [21,30] for example. The rand function in Matlab was used for generating the Hessians of the first three problem sets.…”
Section: Numerical Experiments (mentioning)
confidence: 99%
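A hedged sketch of how such test problems are typically generated: draw a spectrum from a chosen distribution, pin the extreme eigenvalues, and take a diagonal Hessian. NumPy stands in for Matlab's rand; the distributions, dimension, and condition number below are illustrative assumptions, not the actual Table 2 of the cited paper.

```python
# Build quadratic test problems f(x) = 0.5 x^T A x - b^T x with prescribed
# spectra, mimicking the setup the excerpt describes.
import numpy as np

rng = np.random.default_rng(42)
n, kappa = 1000, 1e4                 # dimension and condition number (assumed)

spectra = {
    # uniform draw in [1, kappa], a rand-style construction
    "uniform": 1.0 + (kappa - 1.0) * rng.random(n),
    # eigenvalues clustered at both ends of the spectrum
    "two_clusters": np.concatenate([rng.uniform(1.0, 10.0, n // 2),
                                    rng.uniform(kappa / 10.0, kappa, n - n // 2)]),
}

problems = {}
for name, lam in spectra.items():
    lam = np.sort(lam)
    lam[0], lam[-1] = 1.0, kappa     # pin the extreme eigenvalues
    problems[name] = (np.diag(lam), rng.standard_normal(n))
```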
“…numerical solution methods have been intensively studied in the literature, including spectral gradient methods [5,15], conjugate gradient methods [4,13], and memoryless BFGS methods [16]. Among them, conjugate gradient methods are popular and efficient for solving (1), especially for large-scale problems.…”
Section: Introduction (mentioning)
confidence: 99%
“…Yu et al. [35] combined the trust-region scheme and BB stepsizes with SARAH for solving nonsmooth convex composite problems. A remarkable advantage of the stepsize given by the BB method is that it estimates a scalar approximation of the Hessian and is not sensitive to the choice of initial stepsize; see [5,7,11] and the references therein for more details about BB-like methods. However, research on incorporating BB stepsizes into proximal stochastic gradient methods in the nonconvex nonsmooth case is far less developed than in the convex case.…”
(mentioning)
confidence: 99%
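The "scalar approximation of the Hessian" remark has a compact algebraic form: BB1 picks α so that (1/α)·I best fits the secant pair (s, y) in a least-squares sense. The formula below is the standard BB1 stepsize; the tiny quadratic example around it is an illustrative assumption.

```python
# BB1 stepsize: alpha = argmin_a || (1/a) s - y ||_2 = (s.s) / (s.y),
# i.e. (1/alpha) I is the best scalar approximation of the Hessian along s.
import numpy as np

def bb1_stepsize(s, y):
    return (s @ s) / (s @ y)

# usage on f(x) = 0.5 x^T A x: for a quadratic, y = g(x1) - g(x0) = A s
A = np.diag([1.0, 10.0, 100.0])
x0 = np.array([1.0, 1.0, 1.0])
x1 = np.array([0.9, 0.5, 0.1])
s = x1 - x0
y = A @ s
alpha = bb1_stepsize(s, y)
print(1.0 / alpha)   # a Rayleigh quotient of A: lies between 1 and 100
```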