2015
DOI: 10.1080/02331934.2015.1004549

A strongly convergent proximal bundle method for convex minimization in Hilbert spaces

Abstract: A key procedure in proximal bundle methods for convex minimization problems is the definition of stability centres, which are points generated by the iterative process that successfully decrease the objective function. In this paper we study a different stability-centre classification rule for proximal bundle methods. We show that the proposed bundle variant has at least two particularly interesting features: (i) the sequence of stability centres generated by the method converges strongly to the solution that …

Cited by 10 publications (11 citation statements)
References 31 publications
“…A proximal bundle algorithm can stop with a satisfactory solution $\hat{g}^l$ when both $e_k$ and $\|q_k\|$ are small (van Ackooij, Cruz, & de Oliveira, 2016), where $e_k = \mathcal{L}_{\mathfrak{B}_k}(g^{k+1}) + \langle q_k,\, \hat{g}^l - g^{k+1} \rangle - f(\hat{g}^l) = \delta_k - \|\hat{g}^l - g^{k+1}\|^2 / t_k$ is the aggregate linearization error and $q_k = (g^{k+1} - \hat{g}^l)/t_k$. We define the optimality gap as (best upper bound - best lower bound)/best upper bound, that is, $(UB - f(\hat{g}^{l+1}))/UB$.…”
Section: The Proximal Bundle Methods
Mentioning; confidence: 99%
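For concreteness, the stopping test in the quotation above can be sketched in a few lines of NumPy. This is a minimal illustration under our own naming (bundle_stop_test, model_val standing for $\mathcal{L}_{\mathfrak{B}_k}(g^{k+1})$, f_hat for $f(\hat{g}^l)$); the tolerances are placeholders, not values from the cited paper.

```python
import numpy as np

def bundle_stop_test(model_val, f_hat, g_next, g_hat, t_k, tol_e=1e-6, tol_q=1e-6):
    """Evaluate the quoted stopping test: stop when e_k and ||q_k|| are both small."""
    q_k = (g_next - g_hat) / t_k                       # q_k = (g^{k+1} - ĝ^l) / t_k
    e_k = model_val + q_k @ (g_hat - g_next) - f_hat   # aggregate linearization error e_k
    return e_k, np.linalg.norm(q_k), (abs(e_k) <= tol_e and np.linalg.norm(q_k) <= tol_q)

def optimality_gap(ub, lb):
    """(best upper bound - best lower bound) / best upper bound, as in the quotation."""
    return (ub - lb) / ub
```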
“…As already mentioned, in this paper we are interested in weak and strong convergence of projected gradient methods applied to convex programs such as (1). To the best of our knowledge, weak convergence of the projected gradient method has only been shown under the assumption of Lipschitz continuity of ∇f or using exogenous stepsizes, like Strategy (d) above.…”
Section: Projected Gradient Methods Initialization
Mentioning; confidence: 99%
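The exogenous-stepsize strategy mentioned in this quotation can be illustrated with a short sketch. The divergent-series rule $\alpha_k = 1/(k+1)$ used below is one classical exogenous choice; all names (projected_gradient, project, steps) are ours, not the cited paper's.

```python
import numpy as np

def projected_gradient(grad_f, project, x0, steps=None, max_iter=1000):
    """Projected gradient method with exogenous stepsizes (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        alpha = steps[k] if steps is not None else 1.0 / (k + 1)  # exogenous rule
        x = project(x - alpha * grad_f(x))  # gradient step, then projection onto C
    return x

# Toy usage: minimize 0.5*||x - b||^2 over the Euclidean unit ball.
b = np.array([3.0, 4.0])
project_ball = lambda y: y / max(1.0, np.linalg.norm(y))
x_star = projected_gradient(lambda x: x - b, project_ball, np.zeros(2))
print(x_star)  # approaches b/||b|| = (0.6, 0.8)
```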
“…We now recall some necessary and sufficient optimality conditions for problem (1), whose proof can be found in [7, Prop. 17.4].…”
Section: Preliminaries
Mentioning; confidence: 99%
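The quotation refers to [7, Prop. 17.4] without restating the result. For orientation only, the standard first-order characterization of this type, for a convex differentiable objective over a closed convex set, reads as follows; this is an assumption about the flavour of the cited result, not necessarily its verbatim statement.

```latex
% Standard optimality characterization for a convex differentiable f and a
% closed convex set C in a Hilbert space; in the nonsmooth case, grad f is
% replaced by a subgradient. Assumption: problem (1) has this min-over-C form.
\[
  \bar{x} \in \operatorname*{argmin}_{x \in C} f(x)
  \;\Longleftrightarrow\;
  \langle \nabla f(\bar{x}),\, x - \bar{x} \rangle \ge 0 \quad \forall x \in C
  \;\Longleftrightarrow\;
  \bar{x} = P_C\!\bigl(\bar{x} - \alpha \nabla f(\bar{x})\bigr)
  \quad \text{for every } \alpha > 0,
\]
% where $P_C$ is the metric projection onto $C$.
```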
“…There is a lot of prior work related to the method proposed in this paper. The work on variable metric bundle methods [9, 18, 25, 36, 37, 40-42, 45, 49, 50, 54, 61, 63, 66, 70] and on (inexact) proximal Newton-type methods [8, 11, 12, 27, 38, 39, 43, 44, 51, 59, 60, 62, 71] is probably the most closely related, though our method differs in key ways arising from our assumed access methods. Proximal Newton methods are similar in philosophy to our approach, as both types of methods really shine when the structured part of the objective can be minimized efficiently.…”
Section: Related Work
Mentioning; confidence: 99%
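To make the "philosophy" in the last sentence of the quotation concrete, here is a hedged sketch of one proximal Newton-type step for a composite objective $g + h$ with smooth $g$ and structured $h$. The names prox_newton_step and scaled_prox_h are hypothetical, and practical implementations solve the subproblem inexactly.

```python
import numpy as np

def prox_newton_step(x, grad_g, hess_g, scaled_prox_h):
    """One proximal Newton-type step for min g(x) + h(x) (illustrative sketch).

    scaled_prox_h(v, H) must return argmin_y h(y) + 0.5 * (y - v)^T H (y - v),
    i.e. the prox of the structured part h in the metric induced by H.
    """
    H = hess_g(x)                          # (approximate) Hessian of the smooth part g
    v = x - np.linalg.solve(H, grad_g(x))  # Newton step on the smooth part
    return scaled_prox_h(v, H)             # structured part handled by its scaled prox
```

With H replaced by (1/t)·I this reduces to a proximal gradient step; the point of the quoted remark is that such methods pay off exactly when the scaled prox of h is cheap to evaluate.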