2000
DOI: 10.1023/a:1007602715810
Abstract: We consider the problem of one-step-ahead prediction for time series generated by an underlying stationary stochastic process obeying the condition of absolute regularity, which describes the mixing nature of the process. We make use of recent results from the theory of empirical processes, and adapt the uniform convergence framework of Vapnik and Chervonenkis to the problem of time series prediction, obtaining finite-sample bounds. Furthermore, by allowing both the model complexity and memory size to be adapt…
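The abstract's idea of adapting the memory size to the data can be illustrated with a minimal sketch: a least-squares AR(d) one-step-ahead predictor whose memory length d is chosen on a held-out validation segment. This is a toy stand-in under stated assumptions, not the paper's complexity-penalised selection procedure; the function names and the synthetic AR(2) series are illustrative.

```python
import numpy as np

def fit_ar(x, d):
    # Design matrix: row for time t holds (x_{t-1}, ..., x_{t-d}); target is x_t.
    X = np.array([x[t - d:t][::-1] for t in range(d, len(x))])
    y = x[d:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def one_step_pred(x, coef):
    # Predict the next value from the last d observations.
    d = len(coef)
    return float(coef @ x[-1:-d - 1:-1])

def select_memory(train, val, depths):
    # Pick the memory size d minimising one-step-ahead squared error
    # on a held-out validation segment (a simple stand-in for the
    # paper's adaptive model/memory selection).
    best_d, best_err = None, np.inf
    for d in depths:
        coef = fit_ar(train, d)
        hist, errs = list(train), []
        for v in val:
            errs.append((one_step_pred(np.array(hist), coef) - v) ** 2)
            hist.append(v)  # reveal the true value after predicting it
        err = float(np.mean(errs))
        if err < best_err:
            best_d, best_err = d, err
    return best_d, best_err

rng = np.random.default_rng(0)
# Stationary AR(2) test series: x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + noise.
x = np.zeros(600)
for t in range(2, 600):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + 0.1 * rng.standard_normal()

d, err = select_memory(x[:400], x[400:], depths=[1, 2, 4, 8])
print(d, err)
```

With the noise variance at 0.01, the selected predictor's validation MSE should sit near that noise floor.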

Cited by 62 publications (3 citation statements). References 46 publications.

“…Statistical MTS modelling methods include the Vector Auto-Regressive process [Lütkepohl 1993], the Vector Auto-Regressive Moving Average process [Lütkepohl 1993], and other non-linear [Casdagli 1992] and Bayesian approaches [Pole et al 1994], while various AI methods have been developed for different purposes, including dependence detection in MTS of categorical data [Oates et al 1999], knowledge-based temporal abstraction [Kadous 1999, Shahar 1997], and forecasting [Meir 2000, Weigend 1994]. However, one area that has been largely overlooked is the particular type of time series where the data set consists of a large number of variables but with a small number of observations.…”
Section: Introduction (mentioning)
confidence: 99%
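As a minimal illustration of the Vector Auto-Regressive process named in the excerpt, a VAR(1) model Y_t = A Y_{t-1} + ε_t can be fitted by ordinary least squares; the coefficient matrix and synthetic series below are illustrative assumptions, not data from any cited work.

```python
import numpy as np

def fit_var1(Y):
    # Y: (T, k) multivariate series. Solve min ||Y[:-1] @ B - Y[1:]||
    # by least squares; then Y_t ≈ B.T @ Y_{t-1}, so A = B.T.
    B, *_ = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)
    return B.T

rng = np.random.default_rng(1)
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.4]])  # stable: eigenvalues 0.5, 0.4
Y = np.zeros((500, 2))
for t in range(1, 500):
    Y[t] = A_true @ Y[t - 1] + 0.05 * rng.standard_normal(2)

A_hat = fit_var1(Y)
print(np.round(A_hat, 2))
```

With 500 observations the least-squares estimate recovers A_true closely, which is the regime the excerpt contrasts with the many-variables, few-observations case.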
“…While nonparametric methods exist that are proven to work, in a certain statistical sense different from (2), for arbitrary, unknown stationary ergodic processes (Z_i) [5], these methods require very large data segments for acceptable precision. Approaches for forecasting goals closer to (2) were considered by, e.g., [6,7], but unfortunately these methods require certain mixing conditions that cannot be satisfied by dynamical systems.…”
(mentioning)
confidence: 99%
“…We compute SVM forecasters using memories of increasing length and for training data of two different sizes, n = 800 and n = 1600. Since in estimating the convergence speed of (7) we are using a loose concentration result, a suitable regularization parameter and kernel-width sequence cannot be chosen a priori. Hence, we have adopted a grid search in (λ, γ) space and a 4-fold cross-validation technique [5] to choose (λ_n, γ_n) for a given sample size n. Finally, we use (λ_n, γ_n) for an estimate f_{n,1,1,λ_n,γ_n} constructed from (6) using the whole sample T_n (to simplify notation, we henceforward omit the dependence of f_{n,1,1} on the regularization parameters (λ_n, γ_n)).…”
(mentioning)
confidence: 99%
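The grid search in (λ, γ) space with 4-fold cross-validation described in the excerpt can be sketched as follows, using RBF kernel ridge regression as a self-contained stand-in for the excerpt's SVM forecaster (6); the grids, memory length m, and toy AR(1) series are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gaussian kernel matrix between row sets A (n, m) and B (k, m).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam, gamma):
    # Kernel ridge: solve (K + n*lam*I) alpha = y.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)

def krr_predict(X_train, alpha, gamma, X_new):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

def cv_mse(X, y, lam, gamma, folds=4):
    # 4-fold CV over contiguous blocks (a simple choice for dependent data).
    errs = []
    for te in np.array_split(np.arange(len(X)), folds):
        tr = np.setdiff1d(np.arange(len(X)), te)
        alpha = krr_fit(X[tr], y[tr], lam, gamma)
        pred = krr_predict(X[tr], alpha, gamma, X[te])
        errs.append(np.mean((pred - y[te]) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(3)
n, m = 300, 2  # sample size and memory length (illustrative)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + 0.1 * rng.standard_normal()
X = np.array([x[t - m:t] for t in range(m, n)])  # embed with memory m
y = x[m:]

# Grid search in (lambda, gamma) space, scored by 4-fold CV error.
best = min(((lam, gam) for lam in (1e-3, 1e-2, 1e-1)
            for gam in (0.1, 1.0, 10.0)),
           key=lambda p: cv_mse(X, y, *p))
print(best)
```

The selected pair would then be used to refit on the whole sample, mirroring the excerpt's use of (λ_n, γ_n) with the full sample T_n.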