2013
DOI: 10.1007/s10589-013-9563-6

Preconditioning Newton–Krylov methods in nonconvex large scale optimization

Abstract: We consider an iterative preconditioning technique for non-convex large scale optimization. First, we refer to the solution of large scale indefinite linear systems by using a Krylov subspace method, and describe the iterative construction of a preconditioner which does not involve matrix products or matrix storage. The set of directions generated by the Krylov subspace method is used, as a by-product, to provide an approximate inverse preconditioner. Then, we assess our preconditioner within Truncated N…
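The abstract describes the standard Newton–Krylov (truncated Newton) setting: each outer iteration solves the Newton system only approximately with a Krylov method that accesses the Hessian exclusively through Hessian-vector products. The following is a minimal sketch of that setting, not the authors' algorithm; the names `truncated_newton` and `hvp`, the finite-difference Hessian-vector product and the Armijo backtracking are illustrative assumptions.

```python
import numpy as np

def hvp(grad, x, v, eps=1e-7):
    """Matrix-free Hessian-vector product via a forward difference of gradients."""
    return (grad(x + eps * v) - grad(x)) / eps

def truncated_newton(f, grad, x0, inner_solver, max_outer=100, gtol=1e-6):
    """Minimal Newton-Krylov outer loop: at each iterate the Newton system
    H_k d = -g_k is solved only approximately by a Krylov inner solver,
    which sees H_k only through Hessian-vector products."""
    x = x0.copy()
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) <= gtol:
            break
        matvec = lambda v: hvp(grad, x, v)   # access to H_k is matrix-free
        d = inner_solver(matvec, -g)         # approximate Newton direction
        # simple Armijo backtracking line search (assumes d is a descent direction)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```

In the scheme the abstract outlines, the directions generated inside the inner Krylov solve would additionally be harvested, as a by-product, to build the approximate inverse preconditioner used at later iterations.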

Cited by 25 publications (45 citation statements). References 28 publications. Citation statements below are ordered by relevance; citing publications range from 2016 to 2024.
“…In [12], besides some theoretical properties, the results of an extensive numerical experience are reported, showing that the use of such preconditioners makes PNCG algorithms more efficient and robust than the unpreconditioned ones, on most CUTEst [18] large scale problems. These preconditioners are also inspired by some recent proposals in the context of Newton–Krylov methods (see [14,15]), along with some effective preconditioning techniques from the literature on preconditioners for symmetric linear systems, namely the Limited Memory Preconditioners [19].…”
Section: The Class of Preconditioners (mentioning)
confidence: 99%
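The statement refers to preconditioners for PNCG built from information gathered by Newton–Krylov iterations, in the spirit of the Limited Memory Preconditioners [19]. What follows is a hedged sketch of that idea, not the exact operator proposed in [12] or [19]: assume a previous CG run stored a few A-conjugate directions p_i together with the products y_i = A p_i and that all curvatures p_i^T A p_i are positive, so the preconditioner can be applied with inner products only, with no matrix products or matrix storage. The name `lmp_apply` and its interface are illustrative.

```python
import numpy as np

def lmp_apply(v, P, Y):
    """Apply a limited-memory preconditioner H to v, using m stored pairs
    (p_i, y_i = A p_i) with A-conjugate p_i (so p_i^T A p_j = 0 for i != j):
        H = (I - S D^{-1} S^T A)(I - A S D^{-1} S^T) + S D^{-1} S^T,
    where S = [p_1 ... p_m] and D = diag(p_i^T y_i).
    Only dot products with the stored vectors are needed."""
    d = np.array([p.dot(y) for p, y in zip(P, Y)])          # curvatures p_i^T A p_i
    # w = (I - A S D^{-1} S^T) v
    w = v - sum((p.dot(v) / di) * y for p, y, di in zip(P, Y, d))
    # z = (I - S D^{-1} S^T A) w   (uses A p_i = y_i and symmetry of A)
    z = w - sum((y.dot(w) / di) * p for p, y, di in zip(P, Y, d))
    # H v = z + S D^{-1} S^T v
    return z + sum((p.dot(v) / di) * p for p, di in zip(P, d))
```

With this construction H A p_i = p_i for every stored direction, i.e. the preconditioned operator has the eigenvalue 1 with multiplicity at least m, which is the clustering effect limited memory preconditioners aim for; such an apply routine would then be supplied as the preconditioner of the CG solve at a subsequent Newton iteration.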
“…In addition, we highlight that item (i) in the previous proposition does not also imply that the point $(x_k, x_{0k})^T$, with $y_k = x_k / x_{0k}$, is in the asymptotic cone of $F_\gamma$. Indeed, (i) in general implies the situation depicted in Figure 11 (left), where $(p_k, 0)^T \in C_\infty$, but the point $y_k$ does not satisfy equation (18). Figure 11 (right) gives a graphical representation of item (ii) of Proposition 8.3.…”
Section: CG Iterations and CG Failure: A Geometric Viewpoint (mentioning)
confidence: 97%
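For readers outside the citing paper, the key object in the statement above is the asymptotic cone; the set $F_\gamma$, the cone $C$ and equation (18) belong to that paper and are not reproduced here. As a standard reminder (not taken from the source), the asymptotic cone of a nonempty set $C \subseteq \mathbb{R}^n$ is

$$ C_\infty = \left\{ d \in \mathbb{R}^n \;:\; \exists\, t_k \to +\infty,\ \exists\, x_k \in C \ \text{such that}\ \frac{x_k}{t_k} \to d \right\}, $$

so membership of a direction such as $(p_k, 0)^T$ in $C_\infty$ describes how a set recedes at infinity, and is indeed a separate condition from whether the finite point $y_k$ satisfies a relation such as equation (18); this is exactly the distinction the quotation draws.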
“…Indeed, the results of Lemma 7.1 suggest that in Cartesian coordinates the polar hyperplane of a point not only includes first order information (i.e. the gradient of the function at the current iterate), but also information on the […] Of course, since $y_k$ in Table 1 possibly does not satisfy equation (18), in general in Cartesian coordinates the line $y_k + \lambda p_k$, $\lambda \in \mathbb{R}$, does not include the centre $y^*$ of $F$. Unfortunately, the CG is unable to compute a finite steplength along $p_k$ (see also Figure 11), so that it stops prematurely. Nevertheless, an ad hoc inexact linesearch procedure along $p_k$ could be conceived.…”
Section: Perspectives (mentioning)
confidence: 99%
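The failure mode discussed in the quotation is the classical one for CG applied to an indefinite Newton system: along a direction of non-positive curvature there is no finite minimizing steplength. Below is a minimal sketch of the usual truncated-CG safeguard; the name `truncated_cg` and its interface are illustrative, and simply stopping when non-positive curvature appears (as done here) is precisely the behaviour the quotation proposes to improve upon with an ad hoc linesearch along $p_k$.

```python
import numpy as np

def truncated_cg(matvec, b, tol=1e-8, maxiter=200):
    """Solve A x = b approximately by CG, where A is accessed only through
    matvec(v) = A v.  If a direction p with p^T A p <= 0 is met, CG cannot
    take a finite minimizing step along p, so the iteration is truncated."""
    x = np.zeros_like(b)
    r = b.copy()                 # residual b - A x for x = 0
    p = r.copy()
    rs = r.dot(r)
    for _ in range(maxiter):
        Ap = matvec(p)
        curv = p.dot(Ap)
        if curv <= 0.0:
            # non-positive curvature: return the current iterate, or the
            # steepest descent direction b if this happens at the first step
            return x if x.any() else b
        alpha = rs / curv
        x += alpha * p
        r -= alpha * Ap
        rs_new = r.dot(r)
        if np.sqrt(rs_new) <= tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```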
“…Similarly, matrix-free preconditioning for linear systems or sequences of linear systems is currently an appealing research topic, too (see e.g. [4,5,13,14]).…”
Section: Introduction (mentioning)
confidence: 99%
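Matrix-free preconditioning means that the preconditioner, like the operator itself, is available only as an apply routine. The following is a small sketch of the plumbing only, using SciPy's `LinearOperator` interface; the diagonal model operator and the Jacobi-style apply are placeholders for any apply-only preconditioner, such as the limited-memory apply sketched earlier.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 1000
# a symmetric positive definite model operator, available only as a matvec
diag = np.linspace(1.0, 100.0, n)
A = LinearOperator((n, n), matvec=lambda v: diag * np.ravel(v), dtype=float)

# a matrix-free preconditioner: here a simple Jacobi apply, but any
# apply-only routine fits the same slot
M = LinearOperator((n, n), matvec=lambda v: np.ravel(v) / diag, dtype=float)

b = np.ones(n)
x, info = cg(A, b, M=M)          # info == 0 signals convergence
print(info, np.linalg.norm(diag * x - b))
```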