1976
DOI: 10.1137/0314056

Monotone Operators and the Proximal Point Algorithm

Abstract: We consider a Newton-CG augmented Lagrangian method for solving semidefinite programming (SDP) problems from the perspective of approximate semismooth Newton methods. In order to analyze the rate of convergence of our proposed method, we characterize the Lipschitz continuity of the corresponding solution mapping at the origin. For the inner problems, we show that the positive definiteness of the generalized Hessian of the objective function in these inner problems, a key property for ensuring the efficiency of…
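The augmented Lagrangian framework named in the abstract can be illustrated on a toy problem. A minimal sketch, not the paper's Newton-CG method for SDP: the equality-constrained quadratic problem, the penalty parameter, and the iteration counts below are all illustrative assumptions.

```python
# Augmented Lagrangian (method of multipliers) sketch for the toy problem
#   minimize (x1-1)^2 + (x2-2)^2   subject to   x1 + x2 = 1,
# whose solution is (0, 1) with multiplier 2. The cited work applies this
# framework to SDP with a semismooth Newton-CG inner solver; here the inner
# subproblem is solved by plain gradient descent.

def augmented_lagrangian(rho=10.0, outer=30, inner=300):
    x1, x2, lam = 0.0, 0.0, 0.0
    step = 1.0 / (2.0 + 2.0 * rho)  # safe step: largest Hessian eigenvalue is 2 + 2*rho
    for _ in range(outer):
        # Inner problem: minimize L_rho(x, lam) over x by gradient descent.
        for _ in range(inner):
            r = x1 + x2 - 1.0                      # constraint residual
            g1 = 2.0 * (x1 - 1.0) + lam + rho * r
            g2 = 2.0 * (x2 - 2.0) + lam + rho * r
            x1 -= step * g1
            x2 -= step * g2
        # Outer step: multiplier update (dual ascent).
        lam += rho * (x1 + x2 - 1.0)
    return x1, x2, lam

x1, x2, lam = augmented_lagrangian()
print(round(x1, 4), round(x2, 4), round(lam, 4))  # roughly 0, 1, 2
```

The multiplier update is the point of contact with the proximal point algorithm: Rockafellar's paper shows the method of multipliers is the proximal point algorithm applied to the dual problem.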

Cited by 3,129 publications (2,297 citation statements)
References 22 publications
“…For example, Rockafellar [33] observed that the classical proximal point method converges finitely on a polyhedral function; the same holds for functions with the "weak sharp minimum" property introduced by Ferris [12].…”
Section: Introduction
confidence: 93%
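The finite-convergence observation in this citation statement can be seen on a one-dimensional polyhedral function. A minimal sketch, assuming f(x) = |x| (whose proximal map is soft-thresholding) and an arbitrary starting point:

```python
# Proximal point iteration  x_{k+1} = argmin_x  f(x) + (1/(2c)) * (x - x_k)^2
# on the polyhedral function f(x) = |x|. Its proximal map is the
# soft-thresholding operator, and the iteration reaches the minimizer 0
# in finitely many steps, illustrating Rockafellar's observation.

def prox_abs(y, c):
    """Proximal map of f(x) = |x| with step c (soft-thresholding)."""
    if y > c:
        return y - c
    if y < -c:
        return y + c
    return 0.0

x, c = 2.3, 1.0
trajectory = [x]
for _ in range(10):
    x = prox_abs(x, c)
    trajectory.append(x)

print(trajectory)  # hits the minimizer 0.0 after three steps and stays there
```

Each step shrinks |x| by exactly c until the iterate lands inside [-c, c], after which one more step maps it to 0; a smooth function like x**2 would instead approach 0 only asymptotically.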
“…[28]). The conditions under which the GPPM is known to converge strongly (see [52], [35], [7], [17], [20] and the references therein) are quite restrictive and mostly concern the data of ( ) [in contrast to those ensuring weak convergence which mostly concern the Bregman function whose selection can be done from a relatively large pool of known candidates -cf. [18]].…”
Section: Convergence and Stability of a Regularization Method for Max…
confidence: 99%
“…4.1 A question of interest in convex optimization concerns the strong convergence of the generalized proximal point method (GPPM for short) which emerged from the works of Martinet [43], [44], Rockafellar [52] and Censor and Zenios [21]. When applied to the consistent problem ( ) described in Subsection 3.1 the GPPM produces iterates according to the rule P { } …”
Section: Regularization of a Proximal Point Method
confidence: 99%
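The GPPM iterate rule quoted above replaces the quadratic proximal term with a Bregman distance D_h induced by a kernel h. A minimal numerical sketch, assuming a brute-force grid search as the inner solver and an illustrative quadratic kernel, which recovers the classical proximal step:

```python
# Generalized proximal step:  x_{k+1} = argmin_x  f(x) + (1/c) * D_h(x, x_k),
# where D_h(x, y) = h(x) - h(y) - h'(y) * (x - y) is the Bregman distance
# of the kernel h. Computed here by grid search on an interval; the
# objective, kernel, and data are illustrative choices only.

def bregman_prox(f, h, dh, y, c, lo, hi, n=200001):
    best_x, best_val = None, float("inf")
    for i in range(n):
        x = lo + (hi - lo) * i / (n - 1)
        d = h(x) - h(y) - dh(y) * (x - y)   # Bregman distance D_h(x, y)
        val = f(x) + d / c
        if val < best_val:
            best_x, best_val = x, val
    return best_x

f = lambda x: (x - 3.0) ** 2

# With the quadratic kernel h(x) = x^2/2, D_h(x, y) = (x - y)^2 / 2, so the
# step reduces to the classical proximal step; its closed form for this f
# is x* = (6c + y) / (2c + 1), i.e. 2.0 for y = 0, c = 1.
x = bregman_prox(f, lambda x: 0.5 * x * x, lambda x: x,
                 y=0.0, c=1.0, lo=-1.0, hi=5.0)
print(round(x, 3))  # close to the closed-form value 2.0
```

Swapping in a different kernel, such as the entropy h(x) = x*log(x) on the positive half-line, gives the interior-point-like variants discussed in the Bregman-function literature the quote refers to.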