2002
DOI: 10.1214/aos/1028674843
Oracle inequalities for inverse problems

Abstract: We consider a sequence space model of statistical linear inverse problems where we need to estimate a function f from indirect noisy observations. Let a finite set of linear estimators be given. Our aim is to mimic the estimator in this set that has the smallest risk on the true f. Under general conditions, we show that this can be achieved by simple minimization of an unbiased risk estimator, provided the singular values of the operator of the inverse problem decrease as a power law. The main result is a nonasymptotic oracle …
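The selection rule described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's actual construction: it assumes a diagonal sequence model y_k = b_k θ_k + ε ξ_k with power-law singular values b_k = 1/k, a made-up target θ_k = k^(-3/2), and a small hand-picked family of projection estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sequence space model y_k = b_k * theta_k + eps * xi_k, xi_k ~ N(0, 1).
# All concrete choices below (n, exponents, eps, candidates) are illustrative.
n = 1000
k = np.arange(1, n + 1)
b = 1.0 / k                       # power-law singular values (assumed decay)
theta = k ** -1.5                 # coefficients of the "unknown" f
eps = 1e-3                        # noise level
y = b * theta + eps * rng.standard_normal(n)

def ure(N):
    """Unbiased risk estimator (up to a term not depending on N) for the
    projection estimator theta_hat_k = y_k / b_k for k <= N, 0 otherwise."""
    w = np.zeros(n)
    w[:N] = 1.0
    return np.sum(w**2 * y**2 / b**2 - 2.0 * w * (y**2 - eps**2) / b**2)

def risk(N):
    """Exact risk of the same projection estimator (uses the true theta,
    so it is available only in this simulation, not in practice)."""
    w = np.zeros(n)
    w[:N] = 1.0
    return np.sum((w - 1.0)**2 * theta**2) + eps**2 * np.sum(w**2 / b**2)

candidates = [5, 10, 20, 50, 100, 200]    # finite family of estimators
N_ure = min(candidates, key=ure)           # data-driven choice
N_oracle = min(candidates, key=risk)       # oracle choice
```

Minimizing the unbiased risk estimator over the finite family mimics the oracle: in this toy run the data-driven N lands at or next to the oracle N, and the paper's nonasymptotic oracle inequality bounds the excess risk of such a choice.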

Cited by 133 publications (152 citation statements). References 27 publications.
“…We refer the reader to the paper of Kneip (1993) for an extensive bibliography on data-driven choice of smoothing parameters. More recent references are (Donoho and Johnstone 1995; Birgé and Massart 2001; Cavalier, Golubev, Picard, and Tsybakov 2002).…”
Section: It follows that
confidence: 99%
“…This estimator is adaptive both over Sobolev and analytic scales. In [10] the data-driven choice of the regularizing parameters is based on unbiased risk estimation. The authors consider projection estimators and derive the corresponding oracle inequalities.…”
Section: Introduction
confidence: 99%
“…Projection methods, which are defined as solutions of (1) restricted to finite-dimensional subspaces H_N and K_N (of dimension N), also give rise to attractive approximations of f, by properly choosing the subspaces and the tuning parameter N (Dicken and Maass [10], Mathé and Pereverzev [31]), together with their nonlinear counterparts (Cavalier and Tsybakov [7], Cavalier et al. [6], Tsybakov [40], Goldenshluger and Pereverzev [19], Efromovich and Koltchinskii [16]). In the case where H = K, and K is a self-adjoint operator, the system is particularly simple to solve since the restricted operator K_N is symmetric positive definite.…”
Section: Projection methods
confidence: 99%
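The projection (Galerkin) scheme in the excerpt above, and the remark that the restricted operator is symmetric positive definite when the operator is self-adjoint, can be sketched as follows. All concrete choices here (the min(s, t) kernel, the sine basis, the grid size, and the test function) are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

# Hypothetical setup: K is the self-adjoint positive integral operator with
# kernel min(s, t) on [0, 1]; we solve K f = g by restricting the equation
# to the span of the first N sine basis functions (a Galerkin projection).
m, N = 400, 12
s = (np.arange(m) + 0.5) / m                  # midpoint quadrature nodes
K = np.minimum.outer(s, s) / m                # discretized operator (m x m)

f_true = s * (1.0 - s)                        # "unknown" function (made up)
g = K @ f_true                                # noiseless data for the sketch

# Basis matrix: column j holds phi_j(s) = sin(j * pi * s), j = 1..N.
Phi = np.sin(np.pi * np.outer(s, np.arange(1, N + 1)))

# Restricted operator K_N = Phi^T K Phi is symmetric positive definite
# because K is self-adjoint and positive and Phi has full column rank,
# so a Cholesky factorization solves the projected system directly.
K_N = Phi.T @ K @ Phi
b = Phi.T @ g
L = np.linalg.cholesky(K_N)                   # succeeds only for SPD K_N
c = np.linalg.solve(L.T, np.linalg.solve(L, b))
f_N = Phi @ c                                 # projection-method approximation
```

The Cholesky step is exactly where the self-adjointness pays off: for a general operator the restricted system would need an LU or least-squares solve instead.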