2013
DOI: 10.1080/00036811.2013.833326
Regularization properties of the sequential discrepancy principle for Tikhonov regularization in Banach spaces

Cited by 51 publications (124 citation statements)
References 20 publications
“…For the well-definedness of γ_SDP from SDP, its properties and convergence of regularized solutions x^δ_{γ_SDP} as δ → 0, we refer to [1]. In principle, one can say that γ_SDP is uniquely determined for all 0 < q < 1 and y^δ ∈ Y whenever γ_0 > 0 is large enough.…”
Section: Convergence Rates Under Variational Inequalities
confidence: 99%
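To make the quoted parameter choice concrete: the sequential discrepancy principle selects the regularization parameter from a geometric sequence γ_j = γ_0 q^j with 0 < q < 1, stopping at the first parameter whose discrepancy drops to τδ or below. The following Python sketch illustrates this for a simple linear Tikhonov problem with quadratic penalty in finite dimensions; the solver, the default values of gamma0, q and tau, and the Hilbert-space setting are illustrative assumptions, not the Banach-space framework of the cited paper.

```python
# Minimal sketch of the sequential discrepancy principle (SDP), assuming a
# linear forward operator A and a quadratic penalty; the cited works treat
# general Banach-space penalties, so this is only an illustrative model.
import numpy as np

def tikhonov_minimizer(A, y_delta, gamma):
    """Minimizer of ||A x - y^delta||^2 + gamma * ||x||^2 via normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + gamma * np.eye(n), A.T @ y_delta)

def sdp_choice(A, y_delta, delta, gamma0=1.0, q=0.5, tau=1.5, max_steps=60):
    """Walk down the geometric sequence gamma_j = gamma0 * q**j (0 < q < 1)
    and return the first parameter whose residual is at most tau * delta."""
    gamma = gamma0  # gamma0 must be chosen large enough (cf. the quote above)
    for _ in range(max_steps):
        x = tikhonov_minimizer(A, y_delta, gamma)
        if np.linalg.norm(A @ x - y_delta) <= tau * delta:
            return gamma, x
        gamma *= q  # discrepancy still too large: try the next, smaller gamma
    raise RuntimeError("tau * delta was not reached; check tau, delta, gamma0")
```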
“…We refer to [1,8,29] for further discussions. On the other hand, the application of Proposition 3.2 to the elastic-net regularization (1.5), where β is chosen by TDP and SDP, or LEP, requires us to construct an appropriate variational inequality.…”
Section: Application To Elastic-net Regularization
confidence: 99%
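For orientation, a common form of the elastic-net Tikhonov functional referred to in the quote is recalled below; equation (1.5) of the citing paper is not reproduced here, so the exact weighting of the two penalty terms and the symbols β and η are indicative assumptions only.

```latex
% A common form of the elastic-net functional; the precise weighting in
% equation (1.5) of the citing paper may differ, so this is indicative only.
\[
  T_{\beta}^{\delta}(x) \;=\; \tfrac{1}{2}\,\lVert A x - y^{\delta}\rVert^{2}
  \;+\; \beta \Bigl( \lVert x \rVert_{\ell^{1}}
  \;+\; \tfrac{\eta}{2}\,\lVert x \rVert_{\ell^{2}}^{2} \Bigr),
  \qquad \beta > 0,\ \eta > 0.
\]
```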
“…For more details concerning the consequences of variational inequalities and the role of the choice of the regularization parameter for obtaining convergence rates in regularization we refer, for example, to [19] and [1,4,5,12,15,21]. Making use of Gelfand triples it was shown in [2] that, for a wide range of applied inverse problems, the forward operators A are such that link conditions of the form (3.1) apply for all e^(k), k ∈ ℕ. On the other hand, the paper [13] gives counterexamples where (3.1) fails for specific operators A, but alternative link conditions presented there can compensate this deficit.…”
Section: Improved Convergence Rates
confidence: 99%
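The quoted passages rely on variational inequalities (variational source conditions). A generic form of such a condition, as used throughout this literature, is recalled below; the constant β, the index function φ and the set M are placeholders rather than the specific condition (3.1) of the cited paper.

```latex
% Generic variational source condition with Bregman distance D_xi of the
% penalty Omega, a constant 0 < beta <= 1, and a concave index function phi;
% placeholders only, not the specific condition (3.1) referred to above.
\[
  \beta\, D_{\xi}\bigl(x, x^{\dagger}\bigr) \;\le\;
  \Omega(x) - \Omega\bigl(x^{\dagger}\bigr)
  \;+\; \varphi\bigl(\lVert A x - A x^{\dagger} \rVert\bigr)
  \qquad \text{for all } x \in \mathcal{M}.
\]
```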
“…Moreover, with focus on sparsity, the use of ℓ¹-regularization can be motivated for specific classes of well-posed problems, too (cf., e.g., [10]). Based on the powerful tool of variational inequalities (also called variational source conditions), in [9] convergence rates results on ℓ¹-regularization for linear ill-posed operator equations have been formulated in infinite dimensional spaces under the condition that the sparsity assumption slightly fails, but the solution is still in ℓ¹. In the present paper, we improve those results and illustrate the improvement level with respect to the associated convergence rates for the Cesàro operator equation in ℓ² and for specific denoising problems.…”
Section: Introduction
confidence: 99%
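As a concrete complement to the quoted discussion of ℓ¹-regularization, the sketch below shows iterative soft thresholding (ISTA) for a finite-dimensional surrogate of the ℓ¹-penalized Tikhonov problem; the step size, iteration count and finite-dimensional setting are assumptions made for illustration and are not taken from the cited works.

```python
# ISTA sketch for l1-regularized Tikhonov:
#   min_x  0.5 * ||A x - y^delta||^2 + alpha * ||x||_1,
# as a finite-dimensional surrogate of the infinite-dimensional setting above.
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft thresholding, the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_l1(A, y_delta, alpha, n_iter=500):
    """Iterative soft thresholding with constant step size 1 / ||A||_2^2."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y_delta)            # gradient of the data-fit term
        x = soft_threshold(x - step * grad, alpha * step)
    return x
```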
“…In this situation, we obtain for a choice α = α(δ, y^δ) of the regularization parameter by the sequential discrepancy principle (cf. [1,21]) convergence rates…”
Section: Introduction
confidence: 99%