2023
DOI: 10.3390/math11030640
An Aggregation-Based Algebraic Multigrid Method with Deflation Techniques and Modified Generic Factored Approximate Sparse Inverses

Abstract: In this paper, we examine deflation-based algebraic multigrid methods for solving large systems of linear equations. Aggregation of the unknown terms is applied for coarsening, while deflation techniques are proposed for improving the rate of convergence. More specifically, the V-cycle strategy is adopted, in which, at each iteration, the solution is computed by initially decomposing it utilizing two complementary subspaces. The approximate solution is formed by combining the solution obtained using multigrids…
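To illustrate the general technique the abstract describes, the following is a minimal sketch of a two-grid aggregation-based cycle: unknowns are grouped into aggregates, a piecewise-constant prolongation builds the coarse space, and each cycle combines damped-Jacobi smoothing with a Galerkin coarse-grid correction. This is a generic illustration on a 1-D Poisson model problem, not the authors' method: the paper's deflation subspaces and modified generic factored approximate sparse inverse smoothers are not reproduced here, and all function names are illustrative.

```python
import numpy as np

def poisson1d(n):
    # 1-D Poisson model problem: tridiagonal stencil [-1, 2, -1]
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def aggregation_prolongator(n, agg_size=2):
    # Piecewise-constant prolongation: each aggregate groups agg_size unknowns
    nc = (n + agg_size - 1) // agg_size
    P = np.zeros((n, nc))
    for i in range(n):
        P[i, i // agg_size] = 1.0
    return P

def two_grid_cycle(A, b, x, P, nu=2, omega=0.6):
    D = np.diag(A)
    # Pre-smoothing: damped Jacobi sweeps
    for _ in range(nu):
        x = x + omega * (b - A @ x) / D
    # Coarse-grid correction with the Galerkin operator Ac = P^T A P
    Ac = P.T @ A @ P
    r = b - A @ x
    x = x + P @ np.linalg.solve(Ac, P.T @ r)
    # Post-smoothing
    for _ in range(nu):
        x = x + omega * (b - A @ x) / D
    return x

n = 64
A = poisson1d(n)
b = np.ones(n)
P = aggregation_prolongator(n)
x = np.zeros(n)
for _ in range(30):
    x = two_grid_cycle(A, b, x, P)
print(np.linalg.norm(b - A @ x))
```

In a full multilevel V-cycle the direct coarse solve above would itself be replaced by a recursive application of the same cycle on Ac, and deflation would augment the correction with a small fixed subspace spanning the slowest-converging error components.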

Cited by 1 publication
(1 citation statement)
References 23 publications
“…Approximation using a sparse linear combination of elements from a fixed redundant family is actively used because of its concise representations and increased computational efficiency. It has been applied widely to signal processing, image compression, machine learning and PDE solvers (see [1][2][3][4][5][6][7][8][9][10]). Among others, simultaneous sparse approximation has been utilized in signal vector processing and multi-task learning (see [11][12][13][14]).…”
Section: Introduction
confidence: 99%