2017
DOI: 10.1007/s11590-017-1195-9

A parameterized proximal point algorithm for separable convex optimization

Abstract: In this paper, we develop a Parameterized Proximal Point Algorithm (P-PPA) for solving a class of separable convex programming problems subject to linear and convex constraints. The proposed algorithm is proven to be globally convergent with a worst-case O(1/t) convergence rate, where t denotes the iteration number. By properly choosing the algorithm parameters, numerical experiments on solving a sparse optimization problem arising from statistical learning show that our P-PPA could perform significantly bet…
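For orientation, here is a minimal sketch, inferred from the abstract and the citation statements below rather than reproduced from the paper, of the two-block separable model in question and of a generic proximal point step for it. The symbols θ1, θ2, A, B, b, F, M, Ω are standard notation assumed here; the particular parameterized proximal matrix M that defines the P-PPA is not shown.

\[
\min_{x \in \mathcal{X},\, y \in \mathcal{Y}} \; \theta_1(x) + \theta_2(y)
\quad \text{s.t.} \quad Ax + By = b,
\]

where θ1, θ2 are closed convex functions and 𝒳, 𝒴 are closed convex sets. With w = (x, y, λ), λ being the multiplier of the linear constraint, a generic (preconditioned) proximal point step produces w^{k+1} ∈ Ω satisfying

\[
\theta(u) - \theta(u^{k+1}) + \big(w - w^{k+1}\big)^{\top}\!\left[ F(w^{k+1}) + M\big(w^{k+1} - w^{k}\big) \right] \;\ge\; 0
\quad \forall\, w \in \Omega,
\]

with θ(u) = θ1(x) + θ2(y), F the monotone operator of the KKT system, and M ≻ 0 the proximal matrix whose parameterization gives the algorithm its name.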

Cited by 29 publications (36 citation statements)
References: 29 publications
“…Similar to the convergence proof in, e.g., [1] and the proof of Lemma 3.1, the global convergence and sublinear convergence rate of Algorithm 2.1 can be easily established as below; the proof is omitted here for the sake of conciseness. Theorem 3.1 Let (ρ, r, s) satisfy (5) and {w^k} be generated by Algorithm 2.1.…”
Section: Convergence Analysis (mentioning)
confidence: 98%
“…The augmented Lagrangian method (ALM), independently proposed by Hestenes [7] and Powell [11], is a benchmark method for solving problem (1). Its iteration scheme reads as…”
Section: Introduction (mentioning)
confidence: 99%
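The quoted sentence breaks off before the scheme itself. As a reference point, the classical ALM iteration for a linearly constrained convex problem min_x θ(x) subject to Ax = b (the assumed form of problem (1) in the citing paper), with penalty parameter β > 0 and multiplier λ, reads

\[
\begin{aligned}
x^{k+1} &= \arg\min_{x}\;\Big\{\, \theta(x) - (\lambda^{k})^{\top}(Ax - b) + \tfrac{\beta}{2}\,\|Ax - b\|^{2} \,\Big\}, \\
\lambda^{k+1} &= \lambda^{k} - \beta\,\big(Ax^{k+1} - b\big).
\end{aligned}
\]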
“…Proof: See the proof of Theorem 1 in [1]. In order to establish the convergence rate of Algorithm 2.1 in an ergodic sense, we first need to characterize the solution set of VI(φ, J, Ω), which has been given by, e.g.,…”
Section: 2.4 (mentioning)
confidence: 99%
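For completeness, the characterization alluded to is commonly stated as follows (a standard result, e.g. in Facchinei and Pang's monograph; the notation VI(φ, J, Ω), with variable w ∈ Ω, sub-vector u of w, and convex function J, is assumed from the quote): the solution set can be written as an intersection over the feasible set,

\[
\Omega^{*} \;=\; \bigcap_{w \in \Omega} \Big\{\, \tilde{w} \in \Omega \;:\; J(u) - J(\tilde{u}) + (w - \tilde{w})^{\top}\varphi(w) \;\ge\; 0 \,\Big\},
\]

which is convex and closed by the monotonicity of φ; it is this representation that underlies ergodic O(1/t) rate arguments.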
“…Recently, Cai et al. [2] designed a PPA with a relaxation step for the model (1.1) with p = 2, whose global convergence and worst-case sublinear convergence rate were analyzed in detail. More recently, by introducing some parameters into the proximal metric matrix, an extended parameterized PPA based on [15] was developed for two-block separable convex programming [1]; its effectiveness and robustness were demonstrated on a sparse vector optimization problem from statistical learning, in comparison with two popular algorithms.…”
Section: Introduction (mentioning)
confidence: 99%