2014
DOI: 10.1137/130917673
Generalized Arnoldi--Tikhonov Method for Sparse Reconstruction

Abstract: This paper introduces two new algorithms, belonging to the class of Arnoldi--Tikhonov regularization methods, which are particularly appropriate for sparse reconstruction. The main idea is to consider suitable adaptively defined regularization matrices that allow the usual 2-norm regularization term to approximate a more general regularization term expressed in the $p$-norm, $p\geq 1$. The regularization matrix can be updated both at each step and after some iterations have been performed, leading to two diffe…
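The core device described in the abstract, approximating a $p$-norm penalty with an adaptively weighted 2-norm, can be illustrated outside of any Krylov machinery. Below is a minimal Python/NumPy sketch of the standard reweighting idea; the function name, the damping parameter `eps`, and the dense normal-equations solve are illustrative assumptions of this sketch, not the paper's Arnoldi-based algorithm.

```python
import numpy as np

def reweighted_tikhonov(A, b, lam, p=1.0, iters=10, eps=1e-6):
    """Illustrative iteratively reweighted 2-norm scheme (not the paper's method).

    At each sweep the diagonal regularization matrix
    L = diag((|x_i| + eps)^((p-2)/2)) is rebuilt from the current iterate,
    so that ||L x||_2^2 approximates ||x||_p^p up to the damping eps.
    """
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        # Damped weights; eps guards against division by zero when p < 2.
        w = (np.abs(x) + eps) ** ((p - 2.0) / 2.0)
        L = np.diag(w)
        # Normal equations of the weighted Tikhonov problem
        #   min_x ||A x - b||_2^2 + lam^2 ||L x||_2^2
        x = np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ b)
    return x
```

Rebuilding the weights from the current iterate is what the abstract calls an adaptively defined regularization matrix; the paper performs this update inside a Krylov (Arnoldi) iteration rather than with a dense solve.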

Cited by 59 publications (97 citation statements). References 38 publications.
“…In [16,28,29] a generalized Arnoldi-Tikhonov method (GAT) was introduced that iteratively solves the Tikhonov problem (1) using a Krylov subspace method based on the Arnoldi decomposition of the matrix A. Simultaneously, after each Krylov iteration, the regularization parameter is updated in order to approximate the value for which the discrepancy is equal to ηε.…”
Section: Reference Method: Generalized Bidiagonal-Tikhonov
confidence: 99%
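To make the quoted description concrete, here is a minimal Python/NumPy sketch of an Arnoldi-projected Tikhonov step, assuming standard-form regularization (identity regularization matrix) and a square matrix A; the generalized method in [16,28,29] additionally handles a general regularization matrix and adapts the parameter after every step, which is not shown here.

```python
import numpy as np

def arnoldi(A, b, k):
    """Plain Arnoldi process: A V_k = V_{k+1} H, with H of size (k+1, k).
    Breakdown (H[j+1, j] == 0) is not handled in this sketch."""
    n = b.size
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt orthogonalization
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def projected_tikhonov(H, beta, lam):
    """Solve the small (k+1) x k projected Tikhonov problem
       min_y ||H y - beta e_1||_2^2 + lam^2 ||y||_2^2
    via the stacked least-squares formulation."""
    kp1, k = H.shape
    rhs = np.zeros(kp1 + k)
    rhs[0] = beta                        # beta = ||b||, since V[:, 0] = b / ||b||
    M = np.vstack([H, lam * np.eye(k)])
    y, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return y
```

The full-size iterate is recovered as x_k = V[:, :k] @ y_k, and the projected residual norm ||H y_k − β e_1|| is the discrepancy whose match to ηε drives the parameter update described in the quote.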
“…where $r(z_k) = \|B_{k+1,k} z_k - c_{k+1}\|$ and $r(y_k) = \|B_{k+1,k} y_k - c_{k+1}\|$ are the residuals. A brief sketch of this method is given in algorithm 3, but for more information we refer to [16,28,29]. Note that in the original GAT method, the non-regularized iterates $z_k$ are equivalent to the GMRES [30] iterations for the solution of $Ax = b$.…”
Section: Reference Method: Generalized Bidiagonal-Tikhonov
confidence: 99%
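One way to read the two residuals in the quote is as the sample points of a secant-type zero-finder for the discrepancy curve λ ↦ residual: r(z_k) is its value at λ = 0 (the GMRES iterate) and r(y_k) its value at the current λ. The small sketch below implements that reading; it is an interpretation of the cited works [16,28,29], not a verbatim transcription of their update rule.

```python
def update_lambda(lam, r_z, r_y, eta_eps):
    """Secant-type update of the regularization parameter (illustrative).

    lam     : current regularization parameter
    r_z     : residual of the unregularized projected solution z_k (value at lambda = 0)
    r_y     : residual of the regularized projected solution y_k (value at lam)
    eta_eps : target discrepancy, i.e. the noise level scaled by the safety factor eta

    The discrepancy curve is approximated by the straight line through
    (0, r_z) and (lam, r_y); the return value is where that line meets eta_eps.
    """
    if abs(r_y - r_z) < 1e-14:   # flat secant: keep the current parameter
        return lam
    return abs(eta_eps - r_z) / abs(r_y - r_z) * lam
```

Repeating this update after every Krylov step drives the regularized residual toward ηε, which is the behaviour the quoted passage attributes to GAT.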
“…In [26], a flexible Arnoldi-Tikhonov method was presented for sparse image reconstruction, where a procedure for adapting the regularisation parameter was demonstrated within the solution of a linear least squares problem. However, this procedure differs from the approach proposed in this paper, in the sense that a discrepancy principle was used in [26] with a formulation to recursively change λ_k until the data residual falls just above the noise level, which is assumed to be known for the data. Once this balance is achieved, no more changes in λ_k are observed, which prompted the authors to consider a restarting procedure to be implemented within the algorithm.…”
Section: Adaptive Gauss-Newton Methods
confidence: 99%
“…In addition, most of the theory and methods have been developed for standard‐form Tikhonov, and extensions of hybrid iterative methods to the general‐form Tikhonov problem may be non‐trivial, especially if WL does not have full rank. Various researchers have proposed hybrid approaches based on the Arnoldi process, generalized Krylov bases, and other basis extensions. However, a fully automated implementation that works for extremely large‐scale problems is not readily available.…”
Section: Methods
confidence: 99%