2022
DOI: 10.1007/s10589-022-00409-4
Hybrid limited memory gradient projection methods for box-constrained optimization problems

Abstract: Gradient projection methods represent effective tools for solving large-scale constrained optimization problems thanks to their simple implementation and low computational cost per iteration. Despite these good properties, a slow convergence rate can affect gradient projection schemes, especially when highly accurate solutions are needed. A strategy to mitigate this drawback consists in properly selecting the values for the steplength along the negative gradient. In this paper, we consider the class of gradient …
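As a rough illustration of the class of methods the abstract describes (not the paper's specific hybrid limited memory scheme), a projected gradient iteration with a Barzilai–Borwein steplength on a box-constrained quadratic might be sketched as follows; all names, safeguards, and parameter values here are illustrative choices:

```python
import numpy as np

def project_box(x, lo, hi):
    """Componentwise projection onto the box [lo, hi]."""
    return np.minimum(np.maximum(x, lo), hi)

def gradient_projection_bb(grad, x0, lo, hi, iters=500, alpha0=1.0):
    """Projected gradient with the BB1 steplength s's / s'y.
    Illustrative sketch: no line search, only simple safeguards."""
    x = project_box(x0, lo, hi)
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = project_box(x - alpha * g, lo, hi)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 steplength, kept inside a safeguard interval
        alpha = (s @ s) / sy if sy > 1e-12 else alpha0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, g = x_new, g_new
    return x

# usage: minimize 0.5 x'Ax - b'x over the box [0, 1]^n
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)   # well-conditioned SPD Hessian
b = rng.standard_normal(n)
x = gradient_projection_bb(lambda z: A @ z - b, np.zeros(n), 0.0, 1.0)
```

Note that without a (nonmonotone) line search the BB step carries no general convergence guarantee on constrained problems; practical schemes of the kind discussed in the paper pair the steplength rule with a line search.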

Cited by 7 publications (4 citation statements) · References 42 publications
“…The BB iterations were modified and investigated in a number of publications [65][66][67][68][69][70][71][72][73][74][75][76][77][78][79]. The so-called Scalar Correction (SC) method from [80] proposed the trial steplength in (17) defined by…”
Section: Improved Gradient Descent Methods as Motivation
confidence: 99%
“…Moreover, our approach to the selection of the steplength αk is along the same lines as [29,30], where the authors show that the effectiveness of line-search based FB methods of the form (2) can be significantly improved by devising clever steplength selection rules, in spite of lower theoretical rate estimates.…”
Section: Related Work
confidence: 99%
“…According to the findings in [29,30], we then propose to generalize the implementation of Phila described in the previous section with the subgradient (45) in place of the gradient of f.…”
Section: Non-negatively Constrained Quadratic Problem
confidence: 99%
“…Along the same lines is the paper by Crisci et al [7], which aims to make gradient projection methods effective for constrained optimization problems. Moving from an approach originally proposed by Fletcher [8] for the unconstrained case, in [7] the authors propose a limited memory strategy based on the Ritz-like values of the Hessian matrix restricted to the inactive constraints, once the final active set is reached. Computational experiments show the nice behaviour of their approach on large-scale problems for which classical gradient descent methods usually exhibit very slow convergence.…”
confidence: 94%
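The Ritz-value idea mentioned in this excerpt can be sketched for the simplest setting, an unconstrained quadratic with known Hessian, in the spirit of Fletcher's limited memory steepest descent: the eigenvalues of the Hessian restricted to the span of the last m gradients supply the next m steplengths. This is only a hedged, simplified sketch — the method of Crisci et al additionally restricts the Hessian to the inactive constraints, and all names and safeguards below are illustrative:

```python
import numpy as np

def lmsd_quadratic(A, b, x0, m=4, sweeps=60, tol=1e-8):
    """Illustrative Ritz-value steplengths for f(x) = 0.5 x'Ax - b'x.
    Simplified sketch of the limited memory steepest descent idea;
    constraints are ignored entirely here."""
    x = x0.copy()
    g = A @ x - b
    steps = [(g @ g) / (g @ (A @ g))] * m        # start from the Cauchy step
    for _ in range(sweeps):
        G = []
        for a in steps:
            g = A @ x - b
            if np.linalg.norm(g) < tol:
                return x
            G.append(g)
            x = x - a * g
        G = np.stack(G, axis=1)                  # n x len(steps) back gradients
        G = G / np.linalg.norm(G, axis=0)        # column scaling leaves Ritz values unchanged
        gram, proj = G.T @ G, G.T @ A @ G        # Gram matrix and projected Hessian
        C = np.linalg.cholesky(gram + 1e-10 * np.eye(G.shape[1]))
        Ci = np.linalg.inv(C)
        theta = np.linalg.eigvalsh(Ci @ proj @ Ci.T)   # Ritz values of A on span(G)
        theta = theta[theta > 1e-8 * theta.max()]      # discard spurious tiny values
        steps = np.sort(1.0 / theta)                   # reciprocals as next steplengths
    return x

# usage on a random well-conditioned quadratic
rng = np.random.default_rng(1)
n = 30
Mr = rng.standard_normal((n, n))
A = Mr @ Mr.T + n * np.eye(n)
b = rng.standard_normal(n)
x = lmsd_quadratic(A, b, np.zeros(n))
```

Using the Cholesky factor of the Gram matrix turns the generalized eigenproblem for the projected Hessian into an ordinary symmetric one; discarding near-zero Ritz values guards against the huge steplengths that near-collinear stored gradients would otherwise produce.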