In this contribution we introduce weakly locally stationary time series through the local approximation of the non-stationary covariance structure by a stationary one. This allows us to define autoregression coefficients in a non-stationary context, which, in the particular case of a locally stationary Time-Varying Autoregressive (TVAR) process, coincide with the generating coefficients. We provide and study an estimator of the time-varying autoregression coefficients in a general setting. The proposed estimator of these coefficients enjoys an optimal minimax convergence rate under limited smoothness conditions. In a second step, using a bias reduction technique, we derive a minimax-rate estimator for arbitrarily smooth time-evolving coefficients, which outperforms the previous one for large data sets. In turn, for TVAR processes, the predictor derived from the estimator exhibits an optimal minimax prediction rate.
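The abstract above rests on approximating the non-stationary covariance structure locally by a stationary one, so that autoregression coefficients at a given time can be obtained from autocovariances computed on a local window. The sketch below illustrates this idea with a windowed Yule-Walker solve; the rectangular window, its width, and the function name are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

def local_yule_walker(x, t, d, h):
    """Estimate AR(d) coefficients at time index t from the window
    x[t-h : t+h+1], by solving the empirical Yule-Walker equations
    on that window (a stationary approximation holds locally).
    The rectangular window of half-width h is an illustrative choice."""
    w = x[max(0, t - h): t + h + 1]
    n = len(w)
    w = w - w.mean()
    # empirical autocovariances gamma(0), ..., gamma(d) on the window
    gamma = np.array([np.dot(w[:n - k], w[k:]) / n for k in range(d + 1)])
    # Toeplitz system: Gamma @ theta = (gamma(1), ..., gamma(d))
    Gamma = np.array([[gamma[abs(i - j)] for j in range(d)] for i in range(d)])
    return np.linalg.solve(Gamma, gamma[1:])
```

Applied to a stationary AR(1) sample, the local estimate recovers the generating coefficient up to the usual statistical error; on a slowly varying TVAR sample it tracks the coefficient curve at each t.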
In this work, we study the problem of aggregating a finite number of predictors for nonstationary sub-linear processes. We provide oracle inequalities relying essentially on three ingredients: (1) a uniform bound on the ℓ1-norm of the time-varying sub-linear coefficients, (2) a Lipschitz assumption on the predictors and (3) moment conditions on the noise appearing in the linear representation. Two kinds of aggregation are considered, giving rise to different moment conditions on the noise and to oracle inequalities of varying sharpness. We apply this approach to derive an adaptive predictor for locally stationary time-varying autoregressive (TVAR) processes. It is obtained by aggregating a finite number of well-chosen predictors, each of them enjoying an optimal minimax convergence rate under specific smoothness conditions on the TVAR coefficients. We show that the obtained aggregated predictor achieves a minimax rate while adapting to the unknown smoothness. To prove this result, a lower bound is established for the minimax rate of the prediction risk for the TVAR process. Numerical experiments complete this study. An important feature of this approach is that the aggregated predictor can be computed recursively and is thus applicable in an online prediction context.
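A standard way to aggregate a finite family of predictors online, as in the recursive scheme the abstract refers to, is exponential weighting: each predictor's weight decays with its cumulative loss, and the aggregate is the weighted mean. The following is a minimal sketch under assumed choices (squared loss, a fixed learning rate eta); it does not reproduce the paper's tuning or its oracle-inequality constants.

```python
import numpy as np

def exponential_aggregation(preds, y, eta):
    """Online exponentially weighted aggregation of K predictors.
    preds: (T, K) array of one-step-ahead predictions; y: (T,) observations.
    eta is the learning rate (an assumed tuning parameter here).
    Each aggregated prediction uses only data observed before time t,
    so the whole procedure is recursive and usable online."""
    T, K = preds.shape
    w = np.full(K, 1.0 / K)            # uniform prior weights
    agg = np.empty(T)
    for t in range(T):
        agg[t] = w @ preds[t]          # predict before seeing y[t]
        losses = (preds[t] - y[t]) ** 2
        w = w * np.exp(-eta * losses)  # exponential weight update
        w = w / w.sum()                # renormalize
    return agg
```

With one accurate and one poor predictor, the weights concentrate on the accurate one after a few rounds, which is the qualitative behavior the oracle inequalities quantify.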
We address the problem of forecasting a time series satisfying the Causal Bernoulli Shift (CBS) model, using a parametric set of predictors. The aggregation technique provides a predictor with well-established and quite satisfactory theoretical properties, expressed by an oracle inequality for the prediction risk. The numerical computation of the aggregated predictor usually relies on a Markov chain Monte Carlo method whose convergence should be evaluated. In particular, it is crucial to bound the number of simulations needed to achieve a numerical precision of the same order as the prediction risk. In this direction we present a fairly general result which can be seen as an oracle inequality that includes the numerical cost of computing the predictor. The numerical cost appears by letting the oracle inequality depend on the number of simulations required in the Monte Carlo approximation. Numerical experiments are then carried out to support our findings.

Problem statement and main assumptions. Real stable autoregressive processes of a fixed order, referred to as AR(d) processes, are one of the simplest examples of CBS. They are defined as the stationary solution of

(2.1)    X_t = ∑_{j=1}^{d} θ_j X_{t-j} + σ ξ_t,

where the (ξ_t)_{t∈Z} are i.i.d. real random variables with E[ξ_t] = 0 and E[ξ_t²] = 1.
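The stationary solution of an AR(d) recursion of this form can be approximated numerically by iterating the recursion from an arbitrary initial condition and discarding a burn-in, since stability makes the influence of the initialization decay geometrically. A minimal sketch, assuming Gaussian innovations and an illustrative burn-in length (neither is fixed by the text):

```python
import numpy as np

def simulate_ar(theta, sigma, n, burn=500, rng=None):
    """Simulate n samples of a stable AR(d) process
        X_t = sum_{j=1}^{d} theta[j-1] * X_{t-j} + sigma * xi_t,
    with xi_t i.i.d. standard normal (the Gaussian choice and the
    burn-in length are assumptions made for illustration)."""
    rng = np.random.default_rng() if rng is None else rng
    d = len(theta)
    x = np.zeros(n + burn + d)
    xi = rng.standard_normal(n + burn + d)
    for t in range(d, n + burn + d):
        # most recent past values first, matching theta[0] <-> lag 1
        x[t] = np.dot(theta, x[t - d: t][::-1]) + sigma * xi[t]
    return x[-n:]  # drop burn-in so the output is near-stationary
```

For d = 1 and θ_1 = 0.5, the stationary variance is σ²/(1 − θ_1²) = 4/3, which the sample variance of a long simulated path matches closely.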
2000 Mathematics Subject Classification: 62M20, 62G99, 62M10, 68W27. Key words and phrases: locally stationary time series, autoregression coefficients, time-varying autoregressive processes, minimax-rate prediction. This work has been partially supported by the Conseil régional d'Île-de-France under a doctoral allowance of its program Réseau de Recherche Doctoral en Mathématiques de l'Île de France (RDM-IdF) for the period 2012-2015 and by the Labex LMH (ANR-11-IDEX-003-02).