2010
DOI: 10.1007/s10092-010-0027-4

Comparison of parameter choices in regularization algorithms in case of different information about noise level

Cited by 29 publications (27 citation statements)
References 23 publications
“…We made the computations on the sequence of parameters α_i = r^i, i = 0, 1, …, with r = 1.1. The parameter α is found as the minimizer of the corresponding functions ḡ_k(α) (see (5.2)) in the interval [α̂, ∞), where α̂ is the largest α_i for which the value of ḡ_k(α_i) is C times larger than its value at its current minimum (see [8,10,24]). We used the value C = 1.2; in addition we found the first local minimum (corresponding to C = 1) and the global minimum (corresponding to C = ∞).…”
Section: Convergence and Quasi-optimality of Family of Rules (mentioning)
confidence: 99%
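The selection procedure quoted above can be summarized in a short sketch. The following Python fragment is a minimal illustration under one reading of the quote: `g` stands in for the paper's function ḡ_k, the grid is scanned from large to small parameters, and `alpha_max`, the grid length `n`, and the toy `g` in the usage lines are assumptions made only for this example.

```python
import numpy as np

def choose_alpha(g, alpha_max=1.0, r=1.1, n=200, C=1.2):
    """Minimal sketch of the quoted selection rule (one possible reading).

    Scan a geometric grid of parameters from large to small, keep the running
    minimum of g, and stop at the first grid point where g has risen to C times
    that running minimum; this point plays the role of alpha_hat. The chosen
    parameter is then the minimizer of g over the scanned grid points, i.e. over
    [alpha_hat, infinity) restricted to the grid."""
    alphas = alpha_max / r ** np.arange(n)   # descending geometric grid, r = 1.1 as in the quote
    scanned, values = [], []
    running_min = np.inf
    for a in alphas:
        v = g(a)
        scanned.append(a)
        values.append(v)
        running_min = min(running_min, v)
        if v > running_min and v >= C * running_min:
            break                            # alpha_hat reached: g is C times its current minimum
    i_best = int(np.argmin(values))          # minimizer over the scanned part of the grid
    return scanned[i_best]

if __name__ == "__main__":
    g = lambda a: a + 1e-3 / a               # placeholder function, not the paper's g_k
    print(choose_alpha(g))                   # about 0.03 for this toy g
```

With C = 1 the loop stops at the first rise of g, which matches the "first local minimum" variant mentioned in the quote; letting the scan run over the whole grid corresponds to the global minimum (C = ∞).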
“…Numerical examples in Section 7 show that the rules from the proposed family also work well in the case where the noise level is under- or overestimated many times over. Note that an extensive numerical comparison of other parameter choice rules can be found in [3, 8–10, 24].…”
Section: Introduction (mentioning)
confidence: 99%
“…Investigation of case 3 is motivated by the fact that classical rules for the choice of the regularization parameter that use the noise level (e.g. the discrepancy principle [18,26]) need an exact noise level: they fail in the case of an underestimated noise level and give a large error even for a very moderate overestimation of the noise level (see [8]). On the other hand, for heuristic rules that do not use the noise level, the convergence of the approximate solutions as ‖y − y*‖ → 0 cannot be guaranteed (see [1]).…”
Section: Ill-posed Problems and the Noise Level Information of the Data (mentioning)
confidence: 99%
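For readers unfamiliar with the discrepancy principle referred to above, here is a minimal Python sketch for Tikhonov regularization of a linear system. The constants b, alpha0, q and the geometric search grid are assumptions made for the illustration, not the formulation of [18,26]; the point is that the stopping test uses δ directly, so an underestimated δ can prevent the condition from ever firing, which is the failure mode the quoted passage describes.

```python
import numpy as np

def tikhonov(A, y, alpha):
    # x_alpha = (A^T A + alpha I)^{-1} A^T y, the standard Tikhonov regularized solution
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def discrepancy_principle(A, y, delta, b=1.1, alpha0=1.0, q=0.9, n_max=500):
    """Morozov-type discrepancy principle (illustrative sketch): take the first
    alpha on the geometric grid alpha0 * q**i whose residual norm
    ||A x_alpha - y|| has dropped to b * delta."""
    alpha = alpha0
    for _ in range(n_max):
        x = tikhonov(A, y, alpha)
        if np.linalg.norm(A @ x - y) <= b * delta:
            return alpha, x
        alpha *= q
    # if delta underestimates the true noise level, the condition may never hold;
    # return the last parameter tried so the failure is visible to the caller
    return alpha, tikhonov(A, y, alpha)
```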
“… ; q < 1, for example, q = 0.9; find α as the first α_i for which … . From [8,10,11,20] it is known that this rule guarantees the convergence ‖x_α − x*‖ → 0 as δ → 0, provided that lim ‖y − y*‖/δ ≤ const, and the order-optimal error estimate ‖x_α − x*‖ ≤ const · δ^(p/(p+1)) holds for x* − x_0 ∈ R((A*A)^(p/2)) for all p ≤ 2m, if ‖y − y*‖ ≤ δ and c_1 ≥ 0.24.…”
Section: Rules for Very Approximate Noise Level (mentioning)
confidence: 99%
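The rule quoted above scans a geometric sequence of parameters and takes the first one satisfying a stopping inequality that is cut off in the snippet. The sketch below shows only that scanning skeleton; `condition` is a placeholder for the rule's actual inequality (not reproduced here), and `alpha0` is an assumed starting value.

```python
def first_alpha_on_grid(condition, alpha0=1.0, q=0.9, n_max=500):
    """Scanning skeleton for rules of the form
    'alpha_i = alpha0 * q**i, q < 1; find alpha as the first alpha_i for which <condition> holds'.
    The concrete inequality of the quoted rule is not reproduced here; pass it in as `condition`."""
    for i in range(n_max):
        alpha = alpha0 * q ** i
        if condition(alpha):   # placeholder for the rule's stopping inequality
            return alpha
    return None                # no grid point satisfied the condition
```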