2018
DOI: 10.1214/17-aihp832

Variational multiscale nonparametric regression: Smooth functions

Abstract: For the problem of nonparametric regression of smooth functions, we reconsider and analyze a constrained variational approach, which we call the MultIscale Nemirovski-Dantzig (MIND) estimator. This can be viewed as a multiscale extension of the Dantzig selector (Ann. Statist., 35(6): 2313-51, 2009) based on early ideas of Nemirovski (J. Comput. System Sci., 23:1-11, 1986). MIND minimizes a homogeneous Sobolev norm under the constraint that the multiresolution norm of the residual is bounded by a universal thre…
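For orientation, the constrained program described in the abstract can be sketched as follows. This is our own schematic rendering, assuming the standard regression model $Y_i = f(x_i) + \xi_i$ on a grid of design points; the symbols $k$, $\mathcal{B}$ and $\gamma_n$ are illustrative placeholders rather than the paper's exact notation:

\[
  \hat f \;\in\; \operatorname*{arg\,min}_{g}\; \|g\|_{H^k}
  \quad\text{subject to}\quad
  \max_{B \in \mathcal{B}} \frac{1}{\sqrt{|B|}}
  \Bigl|\, \sum_{x_i \in B} \bigl( Y_i - g(x_i) \bigr) \Bigr|
  \;\le\; \gamma_n ,
\]

where $\|\cdot\|_{H^k}$ is a homogeneous Sobolev (semi)norm, the maximum over the multiscale system $\mathcal{B}$ is the multiresolution norm of the residual, and $\gamma_n$ is a universal threshold calibrated to the noise level.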

Cited by 13 publications (34 citation statements)
References 89 publications (92 reference statements)
“…which can be shown to be 2-normal, see Grasmair et al (2018). (ii) the scale penalties $s_I$ satisfy almost surely that $\sup_{I \in \mathcal{I}} |s_I| \le \delta \log n$ for some constant $\delta > 0$.…”
Section: Multiscale Change-point Segmentation (mentioning)
confidence: 99%
“…Such hybrid approaches have been introduced independently in [18,34,70,89] (see also [21,24]). It is also related to the Dantzig and multiscale estimators of [19,41,43,50,56,74].…”
Section: Variational Characterizations and Extensions (mentioning)
confidence: 99%
“…To the best of our knowledge, the idea of polyhedral estimate goes back to [34], see also [33, Chapter 2], where it was shown that when recovering smooth multivariate regression functions known to belong to Sobolev balls from their noisy observations taken along a regular grid Γ, a polyhedral estimate with ad hoc selected contrast matrix is near-optimal in a wide range of smoothness characterizations and norms $\|\cdot\|$. Recently, the ideas underlying the results of [34] have been taken up in the MIND estimator of [18], then applied in the indirect observation setting in [37] in the context of multiple testing. The goal of this paper is to investigate characteristics of the polyhedral estimate, with a particular emphasis on efficiently computable upper bounds for the risk of the estimate $w_H(\cdot)$ and design of the contrast matrix H resulting in the (nearly) best upper risk bounds.…”
Section: Polyhedral Estimate (mentioning)
confidence: 99%
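For readers unfamiliar with the construction referred to above, the following is a rough paraphrase of the polyhedral-estimate idea (our sketch, not a quotation from the citing paper): given indirect observations $\omega = Ax + \xi$ of a signal $x$ known to lie in a set $\mathcal{X}$, and a linear image $w = Bx$ to be recovered, a contrast matrix $H$ defines the estimate via

\[
  \widehat x_H(\omega) \;\in\; \operatorname*{arg\,min}_{x \in \mathcal{X}}
    \bigl\| H^{T}\bigl( \omega - A x \bigr) \bigr\|_{\infty},
  \qquad
  \widehat w_H(\omega) \;=\; B\, \widehat x_H(\omega),
\]

so that designing the estimator amounts to choosing the columns of $H$ (the contrasts) to make the resulting risk bound as small as possible.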
“…Risk bound (20) allows for a straightforward design of contrast matrices. Recalling that Ψ is monotone on the nonnegative orthant, all we need is to select $h$'s satisfying (18) and resulting in the smallest possible $\varsigma$'s, which is what we are about to do now.…”
Section: Specifying Contrasts (mentioning)
confidence: 99%