2017
DOI: 10.1214/17-ejs1238

Adaptive wavelet multivariate regression with errors in variables

Abstract: In the multidimensional setting, we consider the errors-in-variables model. We aim at estimating the unknown nonparametric multivariate regression function when the covariates are observed with error. We devise an adaptive estimator based on wavelet projection kernels and a deconvolution operator, and we propose an automatic, fully data-driven procedure to select the wavelet resolution level. We obtain an oracle inequality and optimal rates of convergence over anisotropic Hölder classes. Our theoretical results are illus…

Cited by 8 publications (8 citation statements) · References 22 publications
“…In [6] the authors adapt the kernel approach proposed by [7] in their wavelet context and they introduced…”
Section: Approximation Kernels and Family of Estimators for Nonparame…
confidence: 99%
“…Because Beta(2, 2) and Beta(0.5, 2) distributions reflect two different behaviors on [0, 1] we use them. Finally, following the asymptotic considerations given in [6], we choose the primary resolution level j that we have used throughout our simulations as j(n) = log_2(log(n)) + 1.…”
Section: Simulation Study
confidence: 99%
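The resolution-level rule quoted above is a one-liner; a minimal sketch follows. The function name is ours, and we assume the non-integer value is truncated to an integer before use, which the quoted text does not specify:

```python
import math

def primary_resolution_level(n: int) -> int:
    """Primary wavelet resolution level j(n) = log2(log n) + 1,
    as in the quoted simulation setup (integer truncation assumed)."""
    return int(math.floor(math.log2(math.log(n)))) + 1
```

For example, samples of size n = 1000 and n = 1,000,000 give levels 3 and 4 respectively, reflecting the very slow (doubly logarithmic) growth of the primary level with the sample size.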
See 1 more Smart Citation
“…Furthermore, a data-driven bandwidth selection rule is also constructed in this paper, using the Goldenshluger and Lepski (GL) approach [17], which has been applied in various contexts; see for instance Comte et al. [11], Comte and Lacour [9], Doumic et al. [13], and Reynaud-Bouret et al. [25], who apply the GL method in kernel density estimation, and Bertin et al. [3], Chagny [6], Chichignoud et al. [7] or Comte and Rebafka [12]. Our selection rule is then adaptive to the unknown smoothness of the target function, which is new in this context.…”
Section: Introduction
confidence: 99%
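The Goldenshluger–Lepski idea mentioned in the quote above can be sketched for a family of nested projection estimators (such as wavelet estimators indexed by resolution level, where projecting a level-j estimate onto a coarser space recovers the coarser estimate). This is a minimal illustration under that nestedness assumption, with our own function names; the penalty V(j) and the constant kappa stand in for the variance bound and tuning constant of the actual method:

```python
def gl_select(estimates, penalty, kappa=1.0):
    """Goldenshluger-Lepski selection over nested levels (sketch).

    estimates: dict {j: list of estimator values on a common grid}
    penalty:   dict {j: variance bound V(j)}

    Selects the level j minimizing A(j) + kappa * V(j), where
    A(j) = max over j' > j of [ ||f_j - f_j'||^2 - V(j') ]_+ ,
    the usual bias proxy when double projection collapses onto
    the coarser space.
    """
    def sq_dist(a, b):
        # squared l2 distance between two estimates on the grid
        return sum((x - y) ** 2 for x, y in zip(a, b))

    levels = sorted(estimates)
    crit = {}
    for j in levels:
        A = 0.0  # positive part: A starts at 0
        for jp in levels:
            if jp > j:
                A = max(A, sq_dist(estimates[j], estimates[jp]) - penalty[jp])
        crit[j] = A + kappa * penalty[j]
    return min(crit, key=crit.get)
```

The selected level balances the bias proxy A(j), which compares the level-j estimate with all finer ones, against the variance penalty V(j); no smoothness knowledge enters, which is what makes the rule adaptive.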
“…setting, the conditions adopted in Fan & Masry (1992) and Masry (2017) are weakened in Ioannides & Alevizos (1997). To remove the constraints that both the regression function and the density function of the signal variable belong to the same known smoothness class, Comte & Taupin (2007), Chesneau (2010) and Chichignoud et al (2017) proposed some adaptive estimation procedures based on projection methods, while Comte & Lacour (2013) considered a more interesting case in which the multivariate covariates are contaminated with mixed measurement errors. Predictions and confidence intervals were investigated in Delaigle, Hall & Jamshidi (2015) and Carroll, Delaigle & Hall (2009).…”
Section: Introduction
confidence: 99%