A splitting method for separable convex programming
Published: 2014 | DOI: 10.1093/imanum/drt060

Cited by 90 publications (108 citation statements) | References 26 publications

Citation statements (ordered by relevance):
“…We compare it with the splitting method for separable convex programming (denoted by HTY) in [30], the full Jacobian decomposition of the augmented Lagrangian method (denoted by FJDALM) in [13], and the fully parallel ADMM (denoted by PADMM) in [14]. Through these numerical experiments, we want to illustrate that (1) suitable proximal terms can enhance PFPSM's numerical efficiency in practice, (2) larger values of the Glowinski relaxation factor can often accelerate PFPSM's convergence speed, and (3) compared with the other three ADMM-type methods, the dynamically updated step size defined in (27) can accelerate the convergence of PFPSM.…”
Section: Numerical Results
confidence: 99%
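For context on item (2) above, the Glowinski relaxation factor enters the multiplier update of ADMM-type schemes. The sketch below is only the standard two-block setting; it is not the PFPSM iteration itself, and the dynamically updated step size of (27) in the citing paper is not reproduced here.

$$
\min_{x,y}\ \theta_1(x)+\theta_2(y)\quad \text{s.t.}\quad Ax+By=b,
\qquad
\mathcal{L}_\beta(x,y,\lambda)=\theta_1(x)+\theta_2(y)-\lambda^{\top}(Ax+By-b)+\tfrac{\beta}{2}\|Ax+By-b\|^2 .
$$

$$
x^{k+1}=\arg\min_{x}\,\mathcal{L}_\beta(x,y^{k},\lambda^{k}),\qquad
y^{k+1}=\arg\min_{y}\,\mathcal{L}_\beta(x^{k+1},y,\lambda^{k}),\qquad
\lambda^{k+1}=\lambda^{k}-\gamma\beta\,(Ax^{k+1}+By^{k+1}-b),
$$

where $\gamma\in\bigl(0,\tfrac{1+\sqrt{5}}{2}\bigr)$ is Glowinski's relaxation factor; choosing $\gamma>1$ is the classical way to speed up the multiplier update.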
“…Therefore, how to simplify or even remove the potentially expensive correction step is very important in improving the numerical performance of relevant splitting algorithms. Most recently, He et al. [30] were thus inspired to propose a novel splitting method that combines the Gauss-Seidel and Jacobian decompositions, and extensive numerical results further verified the efficiency of the proposed method. In this paper, we continue along this line and propose a new partially parallel splitting method, which follows a decomposition philosophy similar to that of [30], but with a much simpler iterative scheme.…”
Section: Introduction
confidence: 86%
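To make the two decomposition styles mentioned in this statement concrete, here is a generic illustration for a three-block separable problem; it is standard background only and does not reproduce the specific mixed scheme of [30].

$$
\min_{x_1,x_2,x_3}\ \sum_{i=1}^{3}\theta_i(x_i)\quad \text{s.t.}\quad \sum_{i=1}^{3}A_i x_i=b .
$$

A Gauss-Seidel (sequential) decomposition updates the blocks one after another, each subproblem using the most recent iterates:

$$
x_i^{k+1}=\arg\min_{x_i}\,\mathcal{L}_\beta\bigl(x_1^{k+1},\dots,x_{i-1}^{k+1},x_i,x_{i+1}^{k},\dots,x_3^{k},\lambda^{k}\bigr),\qquad i=1,2,3,
$$

whereas a Jacobian (parallel) decomposition solves all block subproblems simultaneously from the old iterate:

$$
x_i^{k+1}=\arg\min_{x_i}\,\mathcal{L}_\beta\bigl(x_1^{k},\dots,x_{i-1}^{k},x_i,x_{i+1}^{k},\dots,x_3^{k},\lambda^{k}\bigr),\qquad i=1,2,3 .
$$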
“…Most recently, He et al. [30] were thus inspired to propose a novel splitting method that combines the Gauss-Seidel and Jacobian decompositions, and extensive numerical results further verified the efficiency of the proposed method. In this paper, we continue along this line and propose a new partially parallel splitting method, which follows a decomposition philosophy similar to that of [30], but with a much simpler iterative scheme. Moreover, our approach can be accelerated by a relaxation step, and, as a consequence, it is faster than the method in [30] (see Sect.…”
Section: Introduction
confidence: 86%
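The relaxation step referred to here is, in prediction-correction splitting methods, typically of the generic form

$$
w^{k+1}=w^{k}+\alpha\,\bigl(\tilde{w}^{k}-w^{k}\bigr),\qquad \alpha>0,
$$

where $\tilde{w}^{k}$ collects the block and multiplier predictors produced by the splitting subproblems; the admissible range of $\alpha$ and the exact step used by the citing paper are not given in this excerpt.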
“…In recent years, some novel splitting methods tailored for iterative scheme (1.1) have been well established in the context of the prediction-correction framework, and they can accordingly be grouped into two categories. The first is the Gauss-Seidel-type splitting method, which uses a correction step to update the output (predictor) generated by iterative scheme (1.2), e.g., see [33][34][35]. The second is the Jacobian-type splitting method, which decomposes the quadratic term in a Jacobian way so that the resulting subproblems can be solved simultaneously.…”
Section: Augmented-Lagrangian-Based Splitting Methods
confidence: 99%
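As an illustration of the Jacobian-type category described above, the block subproblems decompose the quadratic penalty so that each block couples only to the previous iterate of the other blocks and can therefore be solved in parallel. This is a generic template, not the specific schemes of [33-35] or of the citing paper:

$$
\tilde{x}_i^{k}=\arg\min_{x_i}\Bigl\{\theta_i(x_i)-(\lambda^{k})^{\top}A_i x_i+\tfrac{\beta}{2}\bigl\|A_i x_i+\textstyle\sum_{j\neq i}A_j x_j^{k}-b\bigr\|^{2}\Bigr\},\qquad i=1,\dots,m,
$$

$$
\tilde{\lambda}^{k}=\lambda^{k}-\beta\Bigl(\textstyle\sum_{i=1}^{m}A_i\tilde{x}_i^{k}-b\Bigr),
$$

after which the predictor $(\tilde{x}_1^{k},\dots,\tilde{x}_m^{k},\tilde{\lambda}^{k})$ is corrected, for example by a relaxation step of the form shown earlier, in order to guarantee convergence.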