For nonparametric regression with one-sided errors and a boundary curve model for Poisson point processes we consider the problem of efficient estimation of linear functionals. The minimax optimal rate is attained by an unbiased estimation method which nevertheless depends on a Hölder condition or monotonicity assumption for the underlying regression or boundary function. We first construct a simple blockwise estimator and then develop a nonparametric maximum-likelihood approach for exponential noise variables and the point process model. The latter approach even yields non-asymptotic efficiency (UMVU: uniformly minimum variance among all unbiased estimators). The proofs rely essentially on martingale stopping arguments for counting processes and on the point process geometry. The estimators are easily computable, and a small simulation study confirms their applicability.
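A minimal numerical sketch of the blockwise idea (the boundary function, the exponential error scale, and the block length below are illustrative assumptions, not the paper's exact tuning): with nonnegative errors, the block minimum of the responses is a local estimate of the boundary, and summing block minima gives a Riemann-sum estimate of the linear functional ∫₀¹ g(x) dx.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-sided regression: Y_i = g(x_i) + eps_i with eps_i >= 0 (here Exp(1)).
n = 20_000
x = np.sort(rng.uniform(0.0, 1.0, n))
g = lambda t: 1.0 + t**2                  # illustrative boundary, integral = 4/3
y = g(x) + rng.exponential(1.0, n)

# Blockwise estimator of the linear functional int_0^1 g(x) dx:
# block minima estimate g locally; their average is a Riemann sum
# (equal-count blocks, uniform design).
n_blocks = 400
block_min = y.reshape(n_blocks, -1).min(axis=1)
estimate = block_min.mean()

print(estimate)  # close to 4/3, with a small upward bias from the noise minimum
```

The upward bias is the expected minimum of the exponential noise within a block (here 1/50 per block of 50 points), which vanishes as the block size grows.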
Consider a nonparametric regression model with one-sided errors and a regression function in a general Hölder class. We estimate the regression function via minimization of the local integral of a polynomial approximation. We show uniform rates of convergence for this simple regression estimator as well as for a smooth version. These rates carry over to mean regression models with a symmetric and bounded error distribution. In such a setting one obtains faster rates for irregular error distributions, which concentrate sufficient mass near the endpoints, than for the usual regular distributions. The results are applied to prove asymptotic √n-equivalence of a residual-based (sequential) empirical distribution function to the (sequential) empirical distribution function of the unobserved errors in the case of irregular error distributions. This result is remarkably different from the corresponding results in mean regression with regular errors. It can readily be applied to develop goodness-of-fit tests for the error distribution. We present some examples and investigate the small-sample performance in a simulation study. We further discuss asymptotically distribution-free hypothesis tests for independence of the error distribution from the points of measurement, as well as for monotonicity of the boundary function.
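A toy version of the estimation idea (the degree-zero special case; the model, bandwidth, and grid below are illustrative assumptions, not the paper's construction): with nonnegative errors the boundary lies below the data, so a local constant estimate is simply the minimum of the responses in a sliding window.

```python
import numpy as np

rng = np.random.default_rng(1)

# Regression with one-sided (nonnegative) errors on a fixed design.
n = 2000
x = np.linspace(0.0, 1.0, n)
g = lambda t: np.sin(2 * np.pi * t)           # illustrative boundary function
y = g(x) + rng.exponential(0.3, n)

# Local constant boundary estimator: windowed minimum with bandwidth h.
h = 0.02
grid = np.linspace(0.05, 0.95, 19)
g_hat = np.array([y[np.abs(x - t) <= h].min() for t in grid])

max_err = np.max(np.abs(g_hat - g(grid)))
print(max_err)   # small: the windowed minimum tracks the boundary
```

The error balances two terms: the variation of g over the window (of order h) and the minimum of the noise over roughly 2nh points, which is why one-sided errors permit faster rates than mean regression.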
We consider a nonparametric autoregression model under conditional heteroscedasticity with the aim to test whether the innovation distribution changes in time. To this end, we develop an asymptotic expansion for the sequential empirical process of nonparametrically estimated innovations (residuals). We suggest a Kolmogorov–Smirnov statistic based on the difference of the estimated innovation distributions built from the first ⌊ns⌋ and the last n − ⌊ns⌋ residuals, respectively (0 ≤ s ≤ 1). Weak convergence of the underlying stochastic process to a Gaussian process is proved under the null hypothesis of no change point. The result implies that the test is asymptotically distribution-free. Consistency against fixed alternatives is shown. The small sample performance of the proposed test is investigated in a simulation study and the test is applied to a data example.
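A simplified numerical sketch of the test statistic (applied here to directly observed innovations rather than the paper's nonparametrically estimated residuals, and with an illustrative weighting): compare the empirical distributions built from the first ⌊ns⌋ and the last n − ⌊ns⌋ observations and maximize over s.

```python
import numpy as np

def seq_ks(eps):
    """Weighted sup over s of the KS distance between the empirical
    distributions of eps[:k] and eps[k:], with k = floor(n*s)."""
    n = len(eps)
    stat = 0.0
    for k in range(10, n - 10):                 # trim tiny segments
        first, last = np.sort(eps[:k]), np.sort(eps[k:])
        grid = np.sort(eps)
        f1 = np.searchsorted(first, grid, side="right") / k
        f2 = np.searchsorted(last, grid, side="right") / (n - k)
        ks = np.max(np.abs(f1 - f2))
        stat = max(stat, k * (n - k) / n**1.5 * ks)
    return stat

rng = np.random.default_rng(2)
n = 500
null_eps = rng.normal(size=n)                       # no change point
alt_eps = np.concatenate([rng.normal(size=n // 2),  # mean shift at s = 1/2
                          rng.normal(3.0, 1.0, n // 2)])

print(seq_ks(null_eps), seq_ks(alt_eps))  # far larger under the change
```

Under the null both segments estimate the same distribution, so the weighted distance stays bounded; under a fixed alternative the segment distributions separate and the statistic diverges, matching the consistency statement above.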
In the context of nonparametric regression models with one-sided errors, we consider parametric transformations of the response variable in order to obtain independence between the errors and the covariates. In view of estimating the transformation parameter, we use a minimum distance approach and show the uniform consistency of the estimator under mild conditions. The boundary curve, i.e. the regression function, is estimated by applying a smoothed version of a local constant approximation, for which we also prove uniform consistency. We deal with both cases of random covariates and deterministic (fixed) design points. To highlight the applicability of the procedures and to demonstrate their performance, the small-sample behavior is investigated in a simulation study using the so-called Yeo–Johnson transformations, which are typically considered for ϑ ∈ Θ = [0, 2] because then they are bijective maps Λ_ϑ : ℝ → ℝ. The class of sinh-arcsinh transformations, see Jones and Pewsey (2009), does shift the location, but it can be modified to fulfill Λ_ϑ(0) = 0 for all ϑ ∈ Θ, e.g. consider Λ_(ϑ₁,ϑ₂)(y) = sinh(ϑ₁ sinh⁻¹(y) − ϑ₂) − sinh(−ϑ₂). Here ϑ₁ > 0 is the tailweight parameter and ϑ₂ ∈ ℝ the skewness parameter. These transformations also define bijective maps Λ_(ϑ₁,ϑ₂) : ℝ → ℝ.
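The modified sinh-arcsinh transformation can be coded directly from the displayed formula; the inverse below is an assumption of this sketch, obtained by algebraically rearranging the defining equation (it is not stated in the text).

```python
import numpy as np

def lam(y, t1, t2):
    """Modified sinh-arcsinh transformation with lam(0, t1, t2) = 0;
    t1 > 0 is the tailweight parameter, t2 the skewness parameter."""
    return np.sinh(t1 * np.arcsinh(y) - t2) - np.sinh(-t2)

def lam_inv(z, t1, t2):
    """Inverse, from solving z = sinh(t1*arcsinh(y) - t2) + sinh(t2) for y."""
    return np.sinh((np.arcsinh(z - np.sinh(t2)) + t2) / t1)

y = np.linspace(-5.0, 5.0, 101)
z = lam(y, 1.5, 0.7)

print(lam(0.0, 1.5, 0.7))                    # 0 by construction
print(np.all(np.diff(z) > 0))                # strictly increasing on R
print(np.allclose(lam_inv(z, 1.5, 0.7), y))  # round trip recovers y
```

Strict monotonicity plus unbounded range is what makes Λ_(ϑ₁,ϑ₂) a bijection of ℝ, and the subtracted constant sinh(−ϑ₂) enforces Λ(0) = 0 without affecting either property.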
In this paper, we consider autoregressive models with conditional autoregressive variance, including the case of homoscedastic AR models and the case of ARCH models. Our aim is to test the hypothesis of normality for the innovations in a completely nonparametric way, that is, without imposing parametric assumptions on the conditional mean and volatility functions. To this end, the Cramér–von Mises test based on the empirical distribution function of nonparametrically estimated residuals is shown to be asymptotically distribution-free. We demonstrate its good performance for finite sample sizes in a small simulation study. AMS 2010 Classification: Primary 62M10, Secondary 62G10.
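A minimal sketch of the Cramér–von Mises statistic on standardized residuals (computed here on directly simulated innovations; the paper's nonparametric estimation of the conditional mean and volatility is omitted):

```python
import math
import numpy as np

def cvm_normal(res):
    """Cramer-von Mises statistic for normality of standardized residuals:
    W^2 = 1/(12n) + sum_i ((2i-1)/(2n) - Phi(z_(i)))^2."""
    z = (res - res.mean()) / res.std()
    u = np.sort([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])
    n = len(u)
    i = np.arange(1, n + 1)
    return 1 / (12 * n) + np.sum(((2 * i - 1) / (2 * n) - u) ** 2)

rng = np.random.default_rng(3)
w_normal = cvm_normal(rng.normal(size=300))     # small under normality
w_exp = cvm_normal(rng.exponential(size=300))   # large for skewed innovations

print(w_normal, w_exp)
```

The statistic stays small when the innovations are Gaussian and grows with the sample size under a non-Gaussian alternative; the point of the paper is that, after residual estimation, its null distribution is asymptotically free of the unknown mean and volatility functions.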