Tikhonov regularization is a popular approach for obtaining a meaningful solution to an ill-conditioned linear least squares problem. A relatively simple way of choosing a good regularization parameter is given by Morozov's discrepancy principle. However, most approaches require solving the Tikhonov problem for many different values of the regularization parameter, which is computationally demanding for large-scale problems. We propose a new and efficient algorithm that simultaneously solves the Tikhonov problem and finds the corresponding regularization parameter such that the discrepancy principle is satisfied. We achieve this by formulating the problem as a nonlinear system of equations and solving this system using a line search method. We obtain a good search direction by projecting the problem onto a low-dimensional Krylov subspace and computing the Newton direction for the projected problem. This projected Newton direction, which is significantly cheaper to compute than the true Newton direction, is then combined with a backtracking line search to obtain a globally convergent algorithm, which we refer to as the Projected Newton method. We prove convergence of the algorithm and illustrate its improved performance over current state-of-the-art solvers in a number of numerical experiments.
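To make the discrepancy-principle idea concrete, the following is a minimal dense-matrix sketch: it applies Newton's method directly to the scalar discrepancy equation f(λ) = ‖Ax_λ − b‖² − τ², solving the full Tikhonov problem at each step, without the Krylov projection that makes the paper's Projected Newton method efficient. The function name, tolerances, and toy problem sizes are illustrative, not from the paper.

```python
# Hedged sketch: Newton's method on the scalar discrepancy equation
#   f(lam) = ||A x_lam - b||^2 - tau^2 = 0,
# where x_lam = argmin ||A x - b||^2 + lam ||x||^2 (Tikhonov).
# Unlike the paper's Projected Newton method, this works on the full
# problem with dense solves; names and sizes are illustrative only.
import numpy as np

def newton_discrepancy(A, b, tau, lam=1.0, tol=1e-10, maxit=50):
    n = A.shape[1]
    for _ in range(maxit):
        M = A.T @ A + lam * np.eye(n)
        x = np.linalg.solve(M, A.T @ b)      # Tikhonov solution x_lam
        r = A @ x - b
        f = r @ r - tau**2
        if abs(f) < tol * tau**2:
            break
        dx = -np.linalg.solve(M, x)          # d x_lam / d lam
        fprime = 2.0 * r @ (A @ dx)          # f'(lam) = 2 r^T A (dx/dlam)
        lam = max(lam - f / fprime, 1e-15)   # Newton step, keep lam > 0
    return lam, x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 30))
noise = 0.05 * rng.standard_normal(100)
b = A @ rng.standard_normal(30) + noise
lam, x = newton_discrepancy(A, b, tau=np.linalg.norm(noise))
print(lam, np.linalg.norm(A @ x - b) / np.linalg.norm(noise))  # ratio ~ 1
```

The derivative uses dx_λ/dλ = −(AᵀA + λI)⁻¹x_λ, so each Newton step on λ costs one extra linear solve with the already-factored matrix; the paper's contribution is to perform this iteration on a small projected problem instead of the full one.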
Pipelined Krylov subspace methods (also referred to as communication-hiding methods) have been proposed in the literature as a scalable alternative to classic Krylov subspace algorithms for iteratively computing the solution to a large linear system in parallel. For symmetric and positive definite system matrices the pipelined Conjugate Gradient method, p(l)-CG, outperforms its classic Conjugate Gradient counterpart on large-scale distributed-memory hardware by overlapping global communication with essential computations such as the matrix-vector product, thus "hiding" global communication. A well-known drawback of the pipelining technique is the (possibly significant) loss of numerical stability. In this work a numerically stable variant of the pipelined Conjugate Gradient algorithm is presented that avoids the propagation of local rounding errors in the finite precision recurrence relations that construct the Krylov subspace basis. The multi-term recurrence relation for the basis vectors is replaced by three-term recurrences, improving stability without increasing the overall computational cost of the algorithm. The proposed modification ensures that the pipelined Conjugate Gradient method is able to attain a highly accurate solution independently of the pipeline length. Numerical experiments demonstrate a combination of excellent parallel performance and improved maximal attainable accuracy for the new pipelined Conjugate Gradient algorithm. This work thus resolves one of the major practical restrictions on the usability of pipelined Krylov subspace methods.
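As an illustration of the recurrence structure that pipelining introduces, here is a sequential NumPy sketch of the unpreconditioned, pipeline-length-one pipelined CG recurrences in the style of Ghysels and Vanroose. It shows how the auxiliary vectors s = Ap and z = As are updated by recurrence rather than by explicit matrix-vector products, which is what allows the matvec to overlap the global reduction, and also why local rounding errors propagate. The stabilized three-term variant proposed in this work is not shown; all names are illustrative.

```python
# Sequential sketch of the pipelined CG recurrences (unpreconditioned,
# pipeline length l = 1), after Ghysels & Vanroose. In an MPI code the
# two dot products form one non-blocking reduction that overlaps the
# matvec A @ w; the stabilized three-term variant of this work is NOT
# shown. The auxiliary vectors satisfy s = A p, z = A s, w = A r.
import numpy as np

def pipelined_cg(A, b, tol=1e-8, maxit=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    w = A @ r
    z, p, s = (np.zeros_like(b) for _ in range(3))
    gamma_old = alpha_old = 1.0
    for i in range(maxit):
        gamma = r @ r                        # reduction 1
        if np.sqrt(gamma) < tol:
            break
        delta = w @ r                        # reduction 2 (combined in MPI)
        Aw = A @ w                           # matvec, overlapped in MPI
        beta = 0.0 if i == 0 else gamma / gamma_old
        alpha = gamma / (delta - beta * gamma / alpha_old)
        z = Aw + beta * z                    # z = A s via recurrence
        s = w + beta * s                     # s = A p via recurrence
        p = r + beta * p
        x = x + alpha * p
        r = r - alpha * s                    # recurred residual
        w = w - alpha * z                    # recurred w = A r: the source
        gamma_old, alpha_old = gamma, alpha  # of local rounding-error growth
    return x, i

rng = np.random.default_rng(1)
Q = rng.standard_normal((200, 200))
A = Q @ Q.T + 200 * np.eye(200)              # SPD test matrix
b = rng.standard_normal(200)
x, its = pipelined_cg(A, b)
print(its, np.linalg.norm(b - A @ x))
```

Because r and w are never recomputed from x, rounding errors committed in the recurrences accumulate and limit the maximal attainable accuracy, which is precisely the issue the stabilized variant addresses.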
Choosing an appropriate regularization term is necessary to obtain a meaningful solution to an ill-posed linear inverse problem contaminated with measurement errors or noise. The ℓp norm covers a wide range of choices for the regularization term, since its behavior depends critically on the choice of p and since it can easily be combined with a suitable regularization matrix. We develop an efficient algorithm that simultaneously determines the regularization parameter and the corresponding ℓp-regularized solution such that the discrepancy principle is satisfied. We project the problem onto a low-dimensional generalized Krylov subspace and compute the Newton direction for this much smaller problem. We illustrate some interesting properties of the algorithm and compare its performance with other state-of-the-art approaches in a number of numerical experiments, with a special focus on the sparsity-inducing ℓ1 norm and edge-preserving total variation regularization.
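The standard workhorse behind ℓp solvers of this kind is iteratively reweighted least squares (IRLS), which majorizes |t|^p by a smoothed weighted quadratic. The sketch below illustrates that mechanism for a fixed regularization parameter; the paper instead restricts the reweighted problems to a generalized Krylov subspace and adapts the parameter via the discrepancy principle. The function name, smoothing constant, and toy problem are illustrative assumptions.

```python
# Hedged IRLS sketch for the l_p-regularized problem
#   min_x ||A x - b||^2 + lam * ||L x||_p^p ,  0 < p <= 2,
# using the standard smoothed reweighting w = (|L x|^2 + eps)^(p/2 - 1).
# The paper builds a generalized Krylov subspace and tunes lam by the
# discrepancy principle; here lam is fixed and the full-size normal
# equations are solved directly (fine for small problems only).
import numpy as np

def irls_lp(A, b, L, lam, p=1.0, eps=1e-6, iters=30):
    x = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)  # p = 2 start
    for _ in range(iters):
        w = (np.abs(L @ x) ** 2 + eps) ** (p / 2 - 1)  # IRLS weights
        M = A.T @ A + lam * (L.T * w) @ L              # A^T A + lam L^T W L
        x = np.linalg.solve(M, A.T @ b)
    return x

# toy sparse-recovery example with L = identity (smoothed l_1 penalty)
rng = np.random.default_rng(2)
A = rng.standard_normal((80, 120))
x_true = np.zeros(120)
x_true[rng.choice(120, 8, replace=False)] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(80)
x = irls_lp(A, b, L=np.eye(120), lam=0.1, p=1.0)
print(np.round(np.sort(np.abs(x))[-10:], 2))  # the 8 spikes stand out
```

Replacing L with a discrete gradient operator turns the same loop into (smoothed, anisotropic) total variation regularization, which is one of the cases the numerical experiments focus on.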
We develop a computationally efficient algorithm for the automatic regularization of nonlinear inverse problems based on the discrepancy principle. We formulate the problem as an equality-constrained optimization problem, where the constraint is given by a least squares data fidelity term that expresses the discrepancy principle. The objective function is a convex regularization function that incorporates some prior knowledge, such as the total variation regularization function. Using the Jacobian matrix of the nonlinear forward model, we consider a sequence of quadratically constrained optimization problems that can all be solved using the Projected Newton method. We show that the solution of such a quadratically constrained sub-problem yields a descent direction for an exact merit function, which can then be used to define a formal line search method. We also formulate a slightly more heuristic approach that simplifies the algorithm and allows for an inexact solution of the sequence of sub-problems. We illustrate the behavior of the algorithm in a number of numerical experiments, with Talbot-Lau X-ray phase contrast imaging as the main application. The numerical experiments confirm that the quadratically constrained sub-problems need not be solved with high accuracy in early iterations to make sufficient progress towards the solution. In addition, we show that the proposed method produces reconstructions of similar quality to other state-of-the-art approaches at a significant reduction in computational time.
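The following sketch conveys the outer structure only: at each step the nonlinear forward model F is linearized around the current iterate and a quadratically constrained sub-problem is solved. For simplicity it uses the regularizer R(x) = ‖x‖² instead of total variation, finds the multiplier by bisection rather than by the Projected Newton inner solver, and omits the merit function and line search entirely. The toy forward model, all function names, and the solver tolerances are hypothetical.

```python
# Hedged Gauss-Newton-type sketch of the sequential linearization idea:
# at each outer step, linearize F around x_k and solve
#   min_dx ||x_k + dx||^2  s.t.  ||J_k dx - r_k||^2 = tau^2,
# with R(x) = ||x||^2 standing in for total variation and bisection
# standing in for the paper's Projected Newton inner solver.
import numpy as np

def constrained_step(J, r, xk, tau, lo=1e-12, hi=1e12, iters=200):
    # Tikhonov parametrization: dx(lam) minimizes
    # ||J dx - r||^2 + lam ||x_k + dx||^2; its residual grows with lam,
    # so bisection on log(lam) locates ||J dx - r|| = tau (if reachable).
    n = J.shape[1]
    for _ in range(iters):
        lam = np.sqrt(lo * hi)
        dx = np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ r - lam * xk)
        if np.linalg.norm(J @ dx - r) < tau:
            lo = lam                     # residual too small: increase lam
        else:
            hi = lam
    return dx

# toy nonlinear model F(x) = A x + 0.1 (B x)^3 (hypothetical)
rng = np.random.default_rng(3)
A_, B_ = rng.standard_normal((60, 20)), rng.standard_normal((60, 20))
F = lambda x: A_ @ x + 0.1 * (B_ @ x) ** 3
Jac = lambda x: A_ + 0.3 * (B_ * ((B_ @ x) ** 2)[:, None])  # Jacobian of F
x_true = 0.3 * rng.standard_normal(20)
noise = 0.01 * rng.standard_normal(60)
b = F(x_true) + noise
tau, x = np.linalg.norm(noise), np.zeros(20)
for k in range(15):                      # outer linearization loop
    J, r = Jac(x), b - F(x)
    if np.linalg.norm(r) <= 1.01 * tau:
        break                            # discrepancy principle satisfied
    x = x + constrained_step(J, r, x, tau)
print(k, np.linalg.norm(b - F(x)) / tau)
```

When the constraint is unreachable in early iterations (the linearized residual cannot yet be pushed down to τ), the bisection degenerates towards the plain Gauss-Newton step, which is consistent with the paper's observation that the sub-problems need not be solved accurately at first.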