This article presents deflation strategies related to recycling Krylov subspace methods for solving one or a sequence of linear systems of equations. Besides the well-known Ritz- and harmonic Ritz-based deflation strategies, we introduce a deflation technique based on the Singular Value Decomposition (SVD). We consider recycling in two contexts: recycling the Krylov subspace between restart cycles, and recycling a deflation subspace when the matrix changes in a sequence of linear systems. Numerical experiments on real-life reservoir simulations demonstrate the impact of our proposed strategy.
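The common mechanism behind these strategies is deflation itself: a basis Z spanning an approximate invariant (or singular) subspace is projected out of the iteration. As a hedged illustration, the Python sketch below applies deflation to an SPD system with CG, taking an arbitrary user-supplied basis Z that could hold Ritz, harmonic Ritz, or SVD-based vectors recycled from a previous cycle or system; it is a generic deflated-CG template, not the article's exact recycling algorithm.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def deflated_cg(A, b, Z):
    """Solve A x = b (A SPD) with CG on the deflated operator P A,
    where E = Z^T A Z, Q = Z E^{-1} Z^T, and P = I - A Q."""
    E = Z.T @ (A @ Z)                          # small k x k Galerkin matrix
    solve_E = lambda v: np.linalg.solve(E, Z.T @ v)
    Pb = b - A @ (Z @ solve_E(b))              # deflated right-hand side P b
    def PA(v):                                 # apply P A without forming P
        Av = A @ v
        return Av - A @ (Z @ solve_E(Av))
    n = b.shape[0]
    # P A is symmetric positive semidefinite and P b lies in its range,
    # so CG applies to the (singular but consistent) deflated system.
    y, info = cg(LinearOperator((n, n), matvec=PA), Pb)
    # recombine: x = Q b + P^T y, with P^T = I - Q A
    x = Z @ solve_E(b) + y - Z @ solve_E(A @ y)
    return x, info
```

When the matrix changes within a sequence of systems, the same Z (possibly updated) can be reused to warm-start the next solve, which is the recycling setting considered in the article.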
In this paper we present a multilevel preconditioner based on overlapping Schwarz methods for symmetric positive definite (SPD) matrices. Robust two-level Schwarz preconditioners exist in the literature that guarantee fast convergence of Krylov methods. As long as the dimension of the coarse space is reasonable, that is, as long as exact solvers can be used efficiently, two-level methods scale well on parallel architectures. However, the factorization of the coarse space matrix may become costly at scale. An alternative is then to use an iterative method on the second level, combined with an algebraic preconditioner, such as a one-level additive Schwarz preconditioner. Nevertheless, the condition number of the resulting preconditioned coarse space matrix may still be large. One of the difficulties of using more advanced methods, like algebraic multigrid or even two-level overlapping Schwarz methods, to solve the coarse problem is that the matrix no longer arises from a partial differential equation (PDE). We introduce in this paper a robust multilevel additive Schwarz preconditioner in which the condition number is bounded at each level, ensuring fast convergence for each nested solver. Furthermore, our construction does not require any information beyond what is needed to build a two-level method, and may thus be seen as an algebraic extension.
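To make the two-level building block concrete, the sketch below assembles a plain two-level additive Schwarz preconditioner for a 1D Laplacian, with overlapping subdomains and a piecewise-constant coarse space; the test matrix, partition, and coarse space are assumptions made for the example, and this is a minimal illustration of the structure rather than the paper's multilevel construction or its bounded-condition-number coarse spaces.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, cg, splu

n, nsub, ovl = 200, 8, 2                 # problem size, subdomains, overlap
size = n // nsub
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# overlapping index sets and local factorizations (first level)
subs = [np.arange(max(0, i * size - ovl), min(n, (i + 1) * size + ovl))
        for i in range(nsub)]
local_lu = [splu(sp.csc_matrix(A[s, :][:, s])) for s in subs]

# simple coarse space: one piecewise-constant vector per subdomain (second level)
Z = np.zeros((n, nsub))
for i in range(nsub):
    Z[i * size:(i + 1) * size, i] = 1.0
A0 = Z.T @ (A @ Z)                       # coarse matrix, solved exactly here

def apply_M(r):
    z = Z @ np.linalg.solve(A0, Z.T @ r) # coarse correction
    for s, lu in zip(subs, local_lu):    # additive subdomain corrections
        z[s] += lu.solve(r[s])
    return z

x, info = cg(A, b, M=LinearOperator((n, n), matvec=apply_M))
```

In the multilevel method described in the paper, the exact coarse solve (the np.linalg.solve call above) would itself be replaced by a preconditioned iterative solve on the next level, recursively.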
The Tensor-Train (TT) format is a highly compact low-rank representation for high-dimensional tensors. TT is particularly useful when representing approximations to the solutions of certain types of parametrized partial differential equations. For many of these problems, computing the solution explicitly would require an infeasible amount of memory and computational time. While the TT format makes these problems tractable, iterative techniques for solving the PDEs must be adapted to perform arithmetic while maintaining the implicit structure. The fundamental operation used to maintain feasible memory and computational time is called rounding, which truncates the internal ranks of a tensor already in TT format. We propose several randomized algorithms for this task that are generalizations of randomized low-rank matrix approximation algorithms and provide a significant reduction in computation compared to deterministic TT-rounding algorithms. Randomization is particularly effective in the case of rounding a sum of TT-tensors (where we observe 20× speedup), which is the bottleneck computation in the adaptation of GMRES to vectors in TT format. We present the randomized algorithms and compare their empirical accuracy and computational time with deterministic alternatives.
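The randomized TT-rounding algorithms build on the same principle as the randomized range finder for matrices. The sketch below shows only that matrix-level building block, i.e. truncating the rank of a matrix that is numerically low rank; applying the idea core-by-core to a tensor in TT format is where the paper's contribution lies, so treat this function as an illustrative analogue rather than the TT algorithm itself.

```python
import numpy as np

def randomized_truncate(Y, rank, oversample=5, rng=None):
    """Compress a numerically low-rank matrix Y to the given rank using
    the Halko-Martinsson-Tropp randomized range finder plus a small SVD."""
    rng = np.random.default_rng(rng)
    Omega = rng.standard_normal((Y.shape[1], rank + oversample))  # Gaussian sketch
    Q, _ = np.linalg.qr(Y @ Omega)          # orthonormal basis for the range of Y
    B = Q.T @ Y                             # small (rank + oversample) x m core
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ (U[:, :rank] * s[:rank]), Vt[:rank]  # truncated low-rank factors
```

The appeal for sums is that when Y is available only as a sum of factored terms, the sketch Y @ Omega can be formed term by term without ever assembling Y, which is the structural reason randomization pays off when rounding a sum of TT-tensors.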
Randomized methods are becoming increasingly popular in numerical linear algebra. However, few attempts have been made to use them in developing preconditioners. Our interest lies in solving large-scale sparse symmetric positive definite linear systems of equations where the system matrix is preordered to doubly bordered block diagonal form (for example, using a nested dissection ordering). We investigate the use of randomized methods to construct high-quality preconditioners. In particular, we propose a new and efficient approach that employs Nyström's method for computing low-rank approximations to develop robust algebraic two-level preconditioners. Construction of the new preconditioners involves iteratively solving a smaller but denser symmetric positive definite Schur complement system with multiple right-hand sides. Numerical experiments on problems coming from a range of application areas demonstrate that this inner system can be solved cheaply using block conjugate gradients and that using a large convergence tolerance to limit the cost does not adversely affect the quality of the resulting Nyström-Schur two-level preconditioner.
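The key randomized kernel is the Nyström approximation of an SPD matrix that is accessible only through matrix products (here playing the role of the Schur complement). The following sketch of the standard shifted randomized Nyström approximation is a plausible reading of that kernel, with the shift nu, the sketch size, and the name S_matvec chosen purely for illustration; the full Nyström-Schur two-level preconditioner built around it is not reproduced here.

```python
import numpy as np

def nystrom_approx(S_matvec, n, rank, rng=None):
    """Return U and lam with S ≈ U diag(lam) U^T via randomized Nystrom."""
    rng = np.random.default_rng(rng)
    Omega, _ = np.linalg.qr(rng.standard_normal((n, rank)))  # orthonormal Gaussian sketch
    Y = S_matvec(Omega)                      # Y = S @ Omega, the only access to S
    nu = 1e-10 * np.linalg.norm(Y)           # small shift for numerical stability
    Y_nu = Y + nu * Omega
    C = np.linalg.cholesky(Omega.T @ Y_nu)   # Cholesky of the (shifted) core matrix
    B = np.linalg.solve(C, Y_nu.T).T         # B = Y_nu C^{-T}, so B B^T ≈ S + nu I
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    lam = np.maximum(s**2 - nu, 0.0)         # undo the shift
    return U, lam
```

In the paper's setting, the products with the Schur complement required by such a sketch are supplied by the multiple right-hand-side inner solves mentioned above, which is where block conjugate gradients with a loose tolerance keeps the construction cheap.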