2019
DOI: 10.1016/j.amc.2019.03.039

Steplength selection in gradient projection methods for box-constrained quadratic programs

Abstract: The role of steplength selection strategies in gradient methods has been widely investigated in recent decades. Starting from the work of Barzilai and Borwein (1988), many efficient steplength rules have been designed, which contributed to making gradient approaches an effective tool for the large-scale optimization problems arising in important real-world applications. Most of these steplength rules have been conceived in the unconstrained optimization framework, with the aim of exploiting some second-order informatio…
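To ground the terminology, a bare-bones gradient projection iteration for a box-constrained quadratic program is sketched below. This is only an illustrative sketch: the problem data names (Q, b, lo, hi) and the fixed steplength are assumptions, not the method proposed in the paper, which instead studies adaptive steplength rules.

```python
import numpy as np

def gradient_projection_qp(Q, b, lo, hi, x0, alpha, max_iter=1000, tol=1e-8):
    """Gradient projection for min 0.5 x'Qx - b'x s.t. lo <= x <= hi (sketch).

    Each iterate is the projection of a gradient step onto the box;
    here alpha is kept fixed, whereas the cited work studies adaptive rules.
    """
    x = np.clip(x0, lo, hi)
    for _ in range(max_iter):
        g = Q @ x - b                          # gradient of the quadratic objective
        x_new = np.clip(x - alpha * g, lo, hi)  # projected gradient step
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x
```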

Cited by 17 publications (34 citation statements)
References 37 publications
“…The non-restrictive hypothesis on α_k allows it to be selected by means of strategies known in the literature to accelerate the performance of standard first-order methods. In particular, we mention the well-known Barzilai-Borwein rules proposed in the seminal paper [42], which gave rise to a variety of further studies (see for example [43,44,45,46,47,48]) and to more recent variants and adaptations [49,50]. Here we adopt a variant of the rules proposed in [50], which takes into account both the presence of constraints and a nontrivial scaling matrix multiplying the gradient.…”
Section: Numerical Experiments
confidence: 99%
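As a hedged illustration of the classical rules referenced in this excerpt, the following minimal Python sketch computes the two Barzilai-Borwein steplengths from the iterate and gradient differences; the variable names and the safeguarding interval are illustrative assumptions, not the exact notation of the cited papers.

```python
def bb_steplengths(x_prev, x_curr, g_prev, g_curr,
                   alpha_min=1e-10, alpha_max=1e10):
    """Classical Barzilai-Borwein steplengths (illustrative sketch).

    Inputs are assumed to be 1-D numpy arrays.
    With s = x_k - x_{k-1} and y = g_k - g_{k-1}:
        BB1 = s's / s'y   and   BB2 = s'y / y'y,
    both clipped to the safeguarding interval [alpha_min, alpha_max].
    """
    s = x_curr - x_prev
    y = g_curr - g_prev
    sy = float(s @ y)
    if sy <= 0:                 # non-positive curvature: fall back to the safeguard
        return alpha_max, alpha_max
    alpha_bb1 = float(s @ s) / sy
    alpha_bb2 = sy / float(y @ y)
    return (min(max(alpha_bb1, alpha_min), alpha_max),
            min(max(alpha_bb2, alpha_min), alpha_max))
```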
“…Then, the value of α_k is computed by alternating the two values as described in Algorithm ABB_min, where α_min and α_max, with 0 < α_min ≤ α_max, are thresholding constants. More details about stepsize selection can be found in [49,50].…”
Section: Numerical Experiments
confidence: 99%
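The alternation mentioned in the excerpt can be sketched as follows. This is a simplified reading of an ABB_min-type rule: the switching threshold tau, the short memory of BB2 values, and the thresholding into [alpha_min, alpha_max] are assumptions for illustration and do not reproduce the exact rule of [49,50].

```python
def abb_min_steplength(alpha_bb1, alpha_bb2_history, tau=0.8,
                       alpha_min=1e-10, alpha_max=1e10):
    """ABB_min-type alternation between BB steplengths (illustrative sketch).

    If the ratio BB2/BB1 is small, take the minimum of the recent BB2
    values (the more conservative choice); otherwise take BB1.
    The result is thresholded into [alpha_min, alpha_max].
    """
    alpha_bb2 = alpha_bb2_history[-1]
    if alpha_bb2 / alpha_bb1 < tau:
        alpha = min(alpha_bb2_history)   # min over a short memory of BB2 values
    else:
        alpha = alpha_bb1
    return min(max(alpha, alpha_min), alpha_max)
```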
“…In order to remove blurring artifacts from the images of each dataset, we applied a well-known deblurring technique based on the minimization of the Kullback-Leibler divergence with a smooth total-variation regularization term referred to as the hypersurface potential; this minimization is performed by means of the Scaled Gradient Projection (SGP) method proposed in [44] (see also [45]), starting from the inverse Radon transform of the noisy sinogram data. The deblurring procedure exploits the parallel toolbox of Matlab, enabling the use of GPUarray, and it requires about 7 minutes overall.…”
Section: Setup of the Algorithms
confidence: 99%
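For context, a scaled gradient projection step of the kind mentioned in this excerpt can be sketched as below. The projection operator, the diagonal scaling, and the Armijo backtracking shown here are generic assumptions and do not reproduce the exact algorithm of [44].

```python
def sgp_step(x, f, grad_f, proj, d_scaling, alpha, beta=1e-4, sigma=0.5):
    """One scaled gradient projection step (illustrative sketch).

    x is a 1-D numpy array, proj projects onto the feasible set, and
    d_scaling is a positive diagonal scaling stored as a vector.
    y = proj(x - alpha * D * grad f(x)) gives the projected scaled gradient
    point; a backtracking (Armijo) search along d = y - x sets the new iterate.
    """
    g = grad_f(x)
    y = proj(x - alpha * d_scaling * g)    # projected scaled gradient point
    d = y - x                              # feasible descent direction
    lam, fx = 1.0, f(x)
    for _ in range(50):                    # backtracking (Armijo) line search
        if f(x + lam * d) <= fx + beta * lam * float(g @ d):
            break
        lam *= sigma
    return x + lam * d
```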
“…We note that (5) forces a sufficient decrease in the objective function, while the curvature condition (6) prevents the method from taking steps that are too small, which is not guaranteed by condition (5) alone. This drawback can be avoided by choosing α_k with a suitable backtracking procedure [25, page 37].…”
Section: Introduction
confidence: 99%
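A minimal backtracking procedure of the kind alluded to in this excerpt might look as follows; it enforces only the sufficient-decrease (Armijo) condition, and the shrinking factor rho and constant c are illustrative assumptions rather than the exact procedure of [25].

```python
def backtracking_alpha(f, grad_f, x, d, alpha0=1.0, rho=0.5, c=1e-4, max_iter=50):
    """Backtracking line search (illustrative sketch).

    Shrinks alpha until the sufficient-decrease (Armijo) condition
        f(x + alpha d) <= f(x) + c * alpha * grad_f(x)'d
    holds; d is assumed to be a descent direction (grad_f(x)'d < 0).
    """
    fx = f(x)
    slope = float(grad_f(x) @ d)
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            break
        alpha *= rho                       # shrink the trial steplength
    return alpha
```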