2005
DOI: 10.1007/s10957-005-2093-3

Spectral Gradient Methods for Linearly Constrained Optimization

Abstract: Linearly constrained optimization problems with simple bounds are considered in the present work. First, a preconditioned spectral gradient method is defined for the case in which no simple bounds are present. This algorithm can be viewed as a quasi-Newton method in which the approximate Hessians satisfy a weak secant equation. The spectral choice of steplength is embedded into the Hessian approximation, and the whole process is combined with a nonmonotone line search strategy. The simple bounds are then taken …
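The abstract's central combination — a spectral (Barzilai–Borwein) steplength embedded in the iteration and safeguarded by a nonmonotone line search — can be sketched as follows for the unconstrained case. This is an illustrative sketch, not the paper's preconditioned algorithm; the names `f`, `grad`, `M`, and `sigma` and the BB1 steplength formula are assumptions.

```python
import numpy as np

def spectral_gradient(f, grad, x0, max_iter=500, M=10, sigma=1e-4, tol=1e-8):
    """Illustrative sketch: BB1 spectral steplength with a GLL-style
    nonmonotone backtracking line search (unconstrained case)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    lam = 1.0                        # initial spectral steplength
    f_hist = [f(x)]                  # recent objective values (nonmonotone reference)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -lam * g                 # spectral (scaled steepest-descent) direction
        f_ref = max(f_hist[-M:])     # worst of the last M values, not f(x) alone
        alpha, gTd = 1.0, g @ d
        while f(x + alpha * d) > f_ref + sigma * alpha * gTd:
            alpha *= 0.5             # backtrack until the nonmonotone Armijo test holds
        s = alpha * d
        x = x + s
        g_new = grad(x)
        y = g_new - g
        sy = s @ y
        # BB1 spectral steplength; fall back to 1.0 if curvature is not positive
        lam = (s @ s) / sy if sy > 1e-12 else 1.0
        g = g_new
        f_hist.append(f(x))
    return x
```

On a strictly convex quadratic, e.g. f(x) = 0.5 x'Ax - b'x with A = diag(1, 10), the iterates converge to the minimizer A⁻¹b even though the objective values are not forced to decrease monotonically.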

Cited by 16 publications
(13 citation statements)
References 17 publications
“…The case in which the convex feasible set of the problem to be solved by SPG is defined by linear equality and inequality constraints has been considered in Birgin et al (2003b), Andreani et al (2005), Martínez, Pilotta, and Raydan (2005), and . A crucial observation is that this type of set is not necessarily one in which it is easy to project.…”
Section: Applications and Extensions
confidence: 99%
“…In [38] the SPG method was used to solve Augmented Lagrangian subproblems. In [56], the spectral gradient method is used to solve the subproblems arising from the application of an exponential penalty method to linearly constrained optimization.…”
Section: Further Developments
confidence: 99%
“…assume that a parameter in (0,1) is given independently of the iteration number k and that d_k has been computed, and go to step 2; for more detail see [10].…”
Section: Non-monotone Line Search Algorithm
confidence: 99%
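The nonmonotone acceptance rule referenced in the excerpt above is, in the standard Grippo–Lampariello–Lucidi form, the following; the symbols σ (sufficient-decrease parameter, the quantity in (0,1) lost in the garbled excerpt) and M (memory length) are assumptions made here for illustration:

$$
f(x_k + \alpha\, d_k) \;\le\; \max_{0 \le j \le \min(k, M)} f(x_{k-j}) \;+\; \sigma\, \alpha\, \nabla f(x_k)^{\top} d_k, \qquad \sigma \in (0,1).
$$

Replacing f(x_k) by the maximum over the last M iterates allows the objective to increase occasionally, which is what lets the spectral steplength be accepted without damping on most iterations.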