We explain some basic theoretical issues regarding nonparametric statistics applied to inverse problems. Simple examples are used to present classical concepts such as the white noise model, risk estimation, minimax risk, model selection and optimal rates of convergence, as well as more recent concepts such as adaptive estimation, oracle inequalities, modern model selection methods, Stein's unbiased risk estimation and the very recent risk hull method.
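The canonical setting behind these concepts is the Gaussian white noise model for inverse problems and its sequence space reduction; a standard formulation (the notation below is the usual one, not taken from this abstract) is:

```latex
% White noise model: observe the image of f under a compact operator A,
% corrupted by Gaussian white noise \xi at noise level \varepsilon:
Y = Af + \varepsilon\,\xi .
% Expanding in the singular value decomposition of A
% (A\varphi_k = b_k \psi_k, with b_k \to 0) yields the sequence space model
y_k = b_k \theta_k + \varepsilon\,\xi_k ,
\qquad \xi_k \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1), \quad k = 1, 2, \dots,
% where \theta_k = \langle f, \varphi_k \rangle are the coefficients of f
% and the rate of decay of b_k measures the ill-posedness of the problem.
```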
We consider a heteroscedastic sequence space setup with polynomially increasing variances of observations that allows us to treat a number of inverse problems, in particular multivariate ones. We propose an adaptive estimator that attains simultaneously exact asymptotic minimax constants on every ellipsoid of functions within a wide scale (that includes ellipsoids with polynomially and exponentially decreasing axes) and, at the same time, satisfies asymptotically exact oracle inequalities within any class of linear estimates having monotone non-increasing weights. The construction of the estimator is based on a properly penalized blockwise Stein's rule, with weakly geometrically increasing blocks. As an application, we construct sharp adaptive estimators in the problems of deconvolution and tomography.
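A minimal sketch of a penalized blockwise Stein rule in this spirit follows; the block growth rate, the penalty form, and all names are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def penalized_blockwise_stein(y, sigma2, penalty=0.1, rho=1.1):
    """Shrink heteroscedastic observations y_k = theta_k + sigma_k xi_k
    blockwise, applying a penalized Stein weight on each block.

    y       : observations in the sequence space model (1-d array)
    sigma2  : variances sigma_k^2 of the observations (1-d array)
    penalty : illustrative penalty factor inflating the variance term
    rho     : block growth factor; blocks grow weakly geometrically
    """
    theta_hat = np.zeros_like(y, dtype=float)
    start, size = 0, 1.0
    while start < len(y):
        stop = min(start + max(int(size), 1), len(y))
        block = slice(start, stop)
        s2 = sigma2[block].sum()        # total noise variance in the block
        n2 = (y[block] ** 2).sum()      # total observed energy in the block
        # penalized Stein weight: positive part of 1 - (1+penalty)*noise/energy
        w = max(0.0, 1.0 - (1.0 + penalty) * s2 / n2) if n2 > 0 else 0.0
        theta_hat[block] = w * y[block]
        start, size = stop, size * rho  # next block is slightly larger
    return theta_hat
```

Each block's weight mimics the best constant shrinkage on that block; the penalty compensates for the stochastic error of estimating the block energy from the data.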
We consider a sequence space model of statistical linear inverse problems where we need to estimate a function f from indirect noisy observations. Let a finite set of linear estimators be given. Our aim is to mimic the estimator in this set that has the smallest risk on the true f. Under general conditions, we show that this can be achieved by simple minimization of an unbiased risk estimator, provided the singular values of the operator of the inverse problem decrease as a power law. The main result is a nonasymptotic oracle inequality that is shown to be asymptotically exact. This inequality can also be used to obtain sharp minimax adaptive results. In particular, we apply it to show that minimax adaptation on ellipsoids in the multivariate anisotropic case is realized by minimization of an unbiased risk estimator without any loss of efficiency with respect to optimal non-adaptive procedures.
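The selection principle can be illustrated in a few lines. The sketch below (the function name and the weight family are assumptions) computes the exact unbiased risk estimate for each diagonal linear estimator in a finite family and keeps the minimizer:

```python
import numpy as np

def select_by_ure(x, sigma2, weight_family):
    """Pick, from a finite family of diagonal linear estimators,
    the one minimizing the unbiased risk estimate (URE).

    x             : inverted observations x_k = y_k / b_k = theta_k + sigma_k xi_k
    sigma2        : per-coefficient noise variances sigma_k^2 (growing as a
                    power law for an ill-posed problem)
    weight_family : iterable of weight sequences lambda = (lambda_k)
    """
    best, best_ure = None, np.inf
    for lam in weight_family:
        lam = np.asarray(lam, dtype=float)
        # E[x_k^2 - sigma_k^2] = theta_k^2, so this is an exactly unbiased
        # estimate of the risk of theta_hat_k = lambda_k * x_k:
        ure = np.sum((lam - 1.0) ** 2 * (x ** 2 - sigma2) + lam ** 2 * sigma2)
        if ure < best_ure:
            best, best_ure = lam, ure
    return best * x, best_ure
```

For instance, weight_family could consist of projection weights (1, ..., 1, 0, ..., 0) for a grid of cutoffs, which recovers model selection among spectral cutoff estimators.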
We study a standard method of regularization by projections of the linear inverse problem Y = Af + ε, where ε is a white Gaussian noise, and A is a known compact operator with singular values converging to zero with polynomial decay. The unknown function f is recovered by a projection method using the singular value decomposition of A. The bandwidth choice of this projection regularization is governed by a data-driven procedure which is based on the principle of risk hull minimization. We provide nonasymptotic upper bounds for the mean square risk of this method and we show, in particular, that in numerical simulations this approach may substantially improve the classical method of unbiased risk estimation.
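A sketch of the bandwidth selection step, with an illustrative penalty standing in for the actual hull penalty (the form of the penalty and all names here are assumptions):

```python
import numpy as np

def risk_hull_bandwidth(y, b, eps, alpha=1.0):
    """Choose the cutoff m of a spectral projection estimator by
    minimizing a penalized empirical risk, in the spirit of risk hull
    minimization.  The penalty below is an illustrative choice; the
    method's hull penalty is defined more carefully.

    y     : observations y_k = b_k theta_k + eps xi_k in the SVD basis
    b     : singular values b_k of A (polynomial decay)
    eps   : noise level
    alpha : penalty inflation factor (assumption)
    """
    sigma2 = (eps / b) ** 2              # coefficient variance after inversion
    x2 = (y / b) ** 2                    # squared inverted observations
    # URE-type criterion for the projection estimator with cutoff m,
    # dropping the constant term sum of theta_k^2:
    crit = np.cumsum(2.0 * sigma2 - x2)
    # extra penalty controlling the stochastic term, proportional to the
    # standard deviation of sum_{k<=m} sigma_k^2 (xi_k^2 - 1) (assumption):
    pen = alpha * np.sqrt(2.0 * np.cumsum(sigma2 ** 2))
    m = int(np.argmin(crit + pen)) + 1   # selected cutoff (1-indexed)
    theta_hat = np.where(np.arange(len(y)) < m, y / b, 0.0)
    return theta_hat, m
```

The extra penalty is what distinguishes this rule from plain unbiased risk minimization: it guards against cutoffs whose apparently small empirical risk is an artifact of the noise.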
Consider an inverse problem with random noise where we want to estimate a function f. Moreover, suppose that the operator A that we need to invert is not completely known: we know its eigenfunctions and observe its singular values, but only with some noise. To construct our estimator θ, we minimize a modification of an unbiased risk estimator. We obtain non-asymptotic exact oracle inequalities. Considering smooth functions in some standard classes of functions, we prove that θ is asymptotically minimax among a given class of estimators.
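A heavily hedged sketch of plug-in estimation with noisy singular values; the thresholding of small observed singular values and the variance correction are illustrative assumptions only, not the paper's exact modification:

```python
import numpy as np

def estimate_with_noisy_operator(y, b_obs, eps, delta, cutoffs):
    """Sequence space estimation when the singular values are observed
    with noise, b_obs_k = b_k + delta * zeta_k: plug the observed values
    into a projection estimator and choose the cutoff by a URE-type
    criterion with an inflated variance term (assumption).

    y       : observations y_k = b_k theta_k + eps xi_k
    b_obs   : noisy observations of the singular values b_k
    eps     : noise level in the data
    delta   : noise level in the observed singular values
    cutoffs : candidate projection levels m to compare
    """
    # guard against dividing by singular values drowned in operator noise
    b_safe = np.where(np.abs(b_obs) > 2.0 * delta, b_obs, np.inf)
    x2 = (y / b_safe) ** 2
    # variance inflated to account for the noise in b_obs (assumption):
    sigma2 = (eps / b_safe) ** 2 + (delta / b_safe) ** 2 * x2
    best_m, best_crit = cutoffs[0], np.inf
    for m in cutoffs:
        crit = np.sum(2.0 * sigma2[:m] - x2[:m])  # URE-type criterion
        if crit < best_crit:
            best_m, best_crit = m, crit
    theta_hat = np.where(np.arange(len(y)) < best_m, y / b_safe, 0.0)
    return theta_hat, best_m
```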