2019
DOI: 10.1007/s10107-019-01404-0

A linearly convergent doubly stochastic Gauss–Seidel algorithm for solving linear equations and a certain class of over-parameterized optimization problems

Abstract: Consider the classical problem of solving a general linear system of equations Ax = b. It is well known that the (successively over-relaxed) Gauss–Seidel scheme and many of its variants may not converge when A is neither diagonally dominant nor symmetric positive definite. Can we have a linearly convergent G–S type algorithm that works for any A? In this paper we answer this question affirmatively by proposing a doubly stochastic G–S algorithm that is provably linearly convergent (in the mean square error sense)…
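For context, the classical randomized Gauss–Seidel (randomized coordinate descent) update that the paper's doubly stochastic variant builds on can be sketched as follows. This is a minimal illustration of the Leventhal–Lewis scheme, not the paper's doubly stochastic algorithm, and all function and variable names here are our own:

```python
import numpy as np

def randomized_gauss_seidel(A, b, iters=5000, seed=0):
    """Randomized Gauss-Seidel / coordinate descent for min ||Ax - b||_2^2.

    Each step samples a column j with probability ||A[:, j]||^2 / ||A||_F^2
    and updates the j-th coordinate by exact minimization along e_j.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    col_norms = np.sum(A**2, axis=0)
    probs = col_norms / col_norms.sum()
    x = np.zeros(n)
    r = b - A @ x                            # maintained residual b - Ax
    for _ in range(iters):
        j = rng.choice(n, p=probs)
        delta = A[:, j] @ r / col_norms[j]   # exact line search along e_j
        x[j] += delta
        r -= delta * A[:, j]
    return x

# Consistent over-determined toy system: the iterate converges to the solution.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x_hat = randomized_gauss_seidel(A, b)
```

For a full-column-rank A the expected squared error contracts by a factor of 1 − σ_min(A)²/‖A‖_F² per step, which is the linear-rate result of Leventhal and Lewis; the abstract's point is that the doubly stochastic variant keeps a linear rate without any such assumption on A.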

Cited by 20 publications (16 citation statements)
References 25 publications
“…where $\|u\|_{X^*X}^2 = u^* X^* X u = \|Xu\|_2^2$; here $X^*X$ is a Hermitian positive definite matrix. For a comparative analysis of the RK and RGS algorithms, readers can refer to [8,18] and to generalizations to other settings [7,25].…”
Section: RGS Algorithm
confidence: 99%
“…To further develop the RGS method for more general matrices, inspired by the randomized extended Kaczmarz (REK) method [7], Ma et al. [5] presented a variant of the RGS method, i.e., the randomized extended Gauss-Seidel (REGS) method, and proved that the REGS method converges to x⋆ regardless of whether the matrix A has full column rank. After that, many variants of the RGS (or REGS) method were developed and studied extensively; see for example [8][9][10][11][12][13][14] and references therein.…”
Section: Introduction
confidence: 99%
“…Inspired by a work of Strohmer and Vershynin [4], which shows that the randomized Kaczmarz method converges linearly in expectation to the solution, Leventhal and Lewis [5] obtained a similar result for the randomized Gauss-Seidel (RGS) method, which is also called the randomized coordinate descent method. This method works on the columns of the matrix to minimize $\|Ax - b\|_2^2$, choosing columns randomly according to an appropriate probability distribution, and has attracted much attention recently due to its good performance; see for example [6][7][8][9][10][11][12][13][14] and references therein.…”
Section: Introduction
confidence: 99%
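The randomized Kaczmarz method of Strohmer and Vershynin referenced in the statement above admits an equally short sketch, shown here for comparison with RGS: it samples rows of A rather than columns and projects onto one equation at a time. Again, the names below are our own illustrative choices:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Randomized Kaczmarz for a consistent linear system Ax = b.

    Each step samples a row i with probability ||A[i, :]||^2 / ||A||_F^2 and
    projects the iterate onto the hyperplane {x : A[i, :] @ x = b[i]}.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A**2, axis=1)
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the i-th equation's solution hyperplane.
        x += (b[i] - A[i, :] @ x) / row_norms[i] * A[i, :]
    return x

# Consistent system: the iterate converges linearly in expectation to x_star.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 10))
x_star = rng.standard_normal(10)
b = A @ x_star
x_rk = randomized_kaczmarz(A, b)
```

Both methods contract the expected error at the rate 1 − σ_min(A)²/‖A‖_F² per step under full rank and consistency, which is why the two are so often analyzed side by side, as in the comparative references cited above.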