2016
DOI: 10.1007/s10543-016-0637-6
Efficient estimation of regularization parameters via downsampling and the singular value expansion

Abstract: The solution, x, of the linear system of equations Ax ≈ b arising from the discretization of an ill-posed integral equation with a square integrable kernel H(s, t) is considered. The Tikhonov regularized solution x(λ) is found as the minimizer of J(x) = ||Ax − b||₂² + λ²||Lx||₂². x(λ) depends on the regularization parameter λ, which trades off the data fidelity against the smoothing norm determined by L. Here we consider the case where L is diagonal and invertible, and employ the Galerkin method to provide the relationship between the si…
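For a diagonal, invertible L the problem reduces (after the substitution y = Lx) to the standard form, where the Tikhonov solution has the familiar closed form via the SVD filter factors σᵢ/(σᵢ² + λ²). The following is a minimal sketch of that standard-form computation for L = I; the matrix, data, and value of λ are illustrative only and do not come from the paper.

```python
import numpy as np

# Minimal sketch: Tikhonov-regularized solution x(lambda) in standard form
# (L = I), computed through the SVD filter factors sigma_i / (sigma_i^2 + lambda^2).
# A, b, and lam below are illustrative, not taken from the paper.
def tikhonov_svd(A, b, lam):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s**2 + lam**2)        # Tikhonov filter factors
    return Vt.T @ (filt * (U.T @ b))  # x(lambda)

# Tiny ill-conditioned example with consistent data b = A @ [1, 1]
A = np.array([[1.0, 1.0], [1.0, 1.0001]])
b = np.array([2.0, 2.0001])
x = tikhonov_svd(A, b, lam=1e-3)
```

As λ → 0 this recovers the least-squares solution; larger λ damps the contributions of the small singular values, which is exactly the trade-off the regularization parameter controls.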

Cited by 8 publications (6 citation statements); references 27 publications.
“…Future work includes the natural extension of the new class of algorithms to work with Krylov projection methods that are based on algorithms other than GKB (e.g., the Arnoldi algorithm or flexible Krylov methods; see [5]): while the computations involved in Algorithm 3 can be adapted to these situations, the theoretical analysis of the resulting strategies needs to be carefully rethought. Moreover, the new class of algorithms can be extended to handle Tikhonov-TSVD regularization, i.e., regularization methods that apply the Tikhonov method to a TSVD-projected linear system (see, e.g., [27]): in these cases, one should replace the Krylov space K k appearing in Algorithm 3 by the space spanned by the first k right singular vectors of A; a careful theoretical analysis would be needed to prove convergence results. Also, other parameter choice strategies that can be expressed in the framework of bilevel optimization problems (e.g., the UPRE criterion, see [27] and the references therein) can be considered.…”
Section: Discussion
confidence: 99%
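The Tikhonov-TSVD combination mentioned in the statement above can be sketched directly: first project onto the span of the first k right singular vectors (the TSVD step), then apply Tikhonov filtering to the projected system. The function, test matrix, and choices of k and λ below are illustrative assumptions, not code from the cited works.

```python
import numpy as np

# Hedged sketch of Tikhonov-TSVD: truncate the SVD to rank k, then apply
# Tikhonov filtering on the projected system. k and lam are illustrative.
def tikhonov_tsvd(A, b, k, lam):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Uk, sk, Vk = U[:, :k], s[:k], Vt[:k, :]  # rank-k TSVD projection
    filt = sk / (sk**2 + lam**2)             # Tikhonov filter on retained modes
    return Vk.T @ (filt * (Uk.T @ b))        # solution in span of first k right singular vectors

A = np.vander(np.linspace(0.0, 1.0, 6), 4)  # mildly ill-conditioned test matrix
x_true = np.ones(4)
b = A @ x_true
x = tikhonov_tsvd(A, b, k=3, lam=1e-4)
```

By construction the solution lies in the span of the first k right singular vectors of A, which is the subspace the statement proposes in place of the Krylov space K_k.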
“…Moreover, the new class of algorithms can be extended to handle Tikhonov-TSVD regularization, i.e., regularization methods that apply the Tikhonov method to a TSVD-projected linear system (see, e.g., [27]): in these cases, one should replace the Krylov space K k appearing in Algorithm 3 by the space spanned by the first k right singular vectors of A; a careful theoretical analysis would be needed to prove convergence results. Also, other parameter choice strategies that can be expressed in the framework of bilevel optimization problems (e.g., the UPRE criterion, see [27] and the references therein) can be considered. To conclude, the…”
Section: Discussion
confidence: 99%