2013
DOI: 10.1016/j.camwa.2013.05.004

Nested splitting conjugate gradient method for matrix equation AXB = C and preconditioning

Cited by 25 publications (6 citation statements)
References 14 publications

“…In this paper, we present the nested splitting conjugate gradient (NSCG) iterative method for solving the generalized Sylvester equation AXB + CXD = E with large sparse coefficient matrices, which is an extension of [16][17][18]. Based on the NSCG method, we propose the preconditioned nested splitting conjugate gradient (PNSCG) iterative method, which extends the scope of the NSCG iteration method in applications.…”
Section: Discussion
confidence: 98%
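
NSCG-type methods of the kind discussed above nest a conjugate gradient solver inside an outer splitting iteration, with CG applied directly in matrix form rather than to a vectorized Kronecker system. The following is a minimal sketch of such a matrix-form CG kernel for a model problem M X P = R with M and P symmetric positive definite; the function name, test matrices, and tolerances are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def cg_matrix_equation(M, P, R, tol=1e-10, max_iter=500):
    """Conjugate gradient in matrix form for M @ X @ P = R.

    Assumes M and P are symmetric positive definite, so the operator
    X -> M @ X @ P is self-adjoint and positive definite with respect
    to the trace inner product <U, V> = trace(U.T @ V).
    """
    X = np.zeros_like(R)
    res = R - M @ X @ P           # residual matrix
    D = res.copy()                # search direction
    rr = np.sum(res * res)        # trace(res.T @ res)
    for _ in range(max_iter):
        LD = M @ D @ P            # operator applied to the direction
        alpha = rr / np.sum(D * LD)
        X = X + alpha * D
        res = res - alpha * LD
        rr_new = np.sum(res * res)
        if np.sqrt(rr_new) <= tol * np.linalg.norm(R):
            break
        D = res + (rr_new / rr) * D
        rr = rr_new
    return X

# Illustrative use on a small random SPD pair (not data from the cited papers).
rng = np.random.default_rng(0)
G = rng.standard_normal((50, 50)); M = G @ G.T + 50 * np.eye(50)
H = rng.standard_normal((40, 40)); P = H @ H.T + 40 * np.eye(40)
X_true = rng.standard_normal((50, 40))
X = cg_matrix_equation(M, P, M @ X_true @ P)
print(np.linalg.norm(X - X_true) / np.linalg.norm(X_true))  # small relative error
```

Working with the trace inner product keeps the iterates as matrices of the same shape as X, which is what makes the nesting inside an outer splitting iteration practical for large sparse coefficient matrices.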
“…In this section, we will give an implementation of the NSCG method for the generalized Sylvester equation (1.1) and present sufficient conditions for the convergent splitting when the coefficient matrices are nonsymmetric, which extend those results in [16][17][18].…”
Section: The NSCG Method
confidence: 96%
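
The specific sufficient conditions for nonsymmetric coefficient matrices are given in the cited work itself; as general background only, the baseline criterion that any such condition refines is the classical spectral-radius test for a stationary splitting iteration:

```latex
% Classical splitting criterion (general fact, not the cited paper's result):
% for A = M - N with M nonsingular, the stationary iteration
% M x_{k+1} = N x_k + b converges to A^{-1} b from every starting guess
% exactly when the iteration matrix has spectral radius below one.
\[
  x_{k+1} = M^{-1} N \, x_k + M^{-1} b,
  \qquad
  \lim_{k \to \infty} x_k = A^{-1} b \ \text{for all } x_0
  \iff
  \rho\!\left(M^{-1} N\right) < 1 .
\]
```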
“…For Example 1, we compare ADMM with the Hermitian and skew-Hermitian splitting (HSS) iteration method, the gradient-based iteration (GI) method and the nested splitting conjugate gradient (NSCG) method. For the HSS method, we choose ε_k = 0.01, η_k = 0.01 when γ = 0, and ε_k = 0.1, η_k = 0.1 when γ = 1.…”
Section: Numerical Experiments
confidence: 99%
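
As background on the HSS method named in this comparison: the cited experiments apply an HSS-type scheme to the matrix equation, while the classical formulation below is for a linear system Ax = b with positive definite A. A sketch of the two alternating half-steps, with shift parameter α > 0:

```latex
% Classical HSS iteration for Ax = b (background only).
% Split A = H + S with H = (A + A^*)/2 (Hermitian part) and
% S = (A - A^*)/2 (skew-Hermitian part), pick a shift alpha > 0,
% and alternate the two half-steps:
\[
\begin{aligned}
  (\alpha I + H)\, x^{(k+1/2)} &= (\alpha I - S)\, x^{(k)} + b, \\
  (\alpha I + S)\, x^{(k+1)}   &= (\alpha I - H)\, x^{(k+1/2)} + b .
\end{aligned}
\]
```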
“…presented gradient-based iteration methods as well as least-squares-based iteration methods for the linear matrix equation, which were extensions of the Jacobi and Gauss-Seidel iterations for Ax = b. In , a class of nested splitting conjugate gradient (NSCG) methods was proposed, which were extensions of the NSCG method for Ax = b.…”
Section: Introduction
confidence: 99%
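
A minimal sketch of a gradient-based iteration of this type, written for the model equation AXB = C of the indexed paper: the update X_{k+1} = X_k + μ Aᵀ(C − A X_k B)Bᵀ and the step-size bound shown are the standard ones for such schemes, while the function name, default step size, and test data are illustrative assumptions rather than details from the cited works.

```python
import numpy as np

def gradient_iteration(A, B, C, mu=None, tol=1e-10, max_iter=20000):
    """Gradient-based iteration for the matrix equation A @ X @ B = C.

    Update: X_{k+1} = X_k + mu * A.T @ (C - A @ X_k @ B) @ B.T.
    A standard sufficient condition for convergence (when a solution exists)
    is 0 < mu < 2 / (lambda_max(A.T @ A) * lambda_max(B.T @ B)).
    """
    if mu is None:
        # Safe default step size within the standard bound.
        mu = 1.0 / (np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2)
    X = np.zeros((A.shape[1], B.shape[0]))
    for _ in range(max_iter):
        R = C - A @ X @ B                      # current residual
        if np.linalg.norm(R) <= tol * np.linalg.norm(C):
            break
        X = X + mu * (A.T @ R @ B.T)           # gradient step
    return X

# Illustrative use on random full-rank data (not from the cited papers).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 30)) + 30 * np.eye(30)
B = rng.standard_normal((20, 20)) + 20 * np.eye(20)
X_true = rng.standard_normal((30, 20))
X = gradient_iteration(A, B, A @ X_true @ B)
print(np.linalg.norm(X - X_true) / np.linalg.norm(X_true))  # small relative error
```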