2017
DOI: 10.48550/arXiv.1711.11053

A Multi-Horizon Quantile Recurrent Forecaster

Abstract: We propose a framework for general probabilistic multi-step time series regression. Specifically, we exploit the expressiveness and temporal nature of Sequence-to-Sequence Neural Networks (e.g. recurrent and convolutional structures), the nonparametric nature of Quantile Regression, and the efficiency of Direct Multi-Horizon Forecasting. A new training scheme, forking-sequences, is designed for sequential nets to boost stability and performance. We show that the approach accommodates both temporal and static covariates…
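The training objective behind this framework is the quantile (pinball) loss, evaluated over a set of target quantile levels and over every horizon step emitted by the direct multi-horizon decoder. Below is a minimal sketch of that loss, assuming PyTorch; the function and tensor names are illustrative, not taken from the paper:

```python
import torch

def quantile_loss(y_true, y_pred, quantiles):
    """Pinball (quantile) loss averaged over a set of quantile levels.

    y_true:    (batch, horizon)        observed future values
    y_pred:    (batch, horizon, n_q)   one forecast per quantile level
    quantiles: floats in (0, 1), e.g. (0.1, 0.5, 0.9)
    """
    losses = []
    for i, q in enumerate(quantiles):
        err = y_true - y_pred[..., i]               # > 0 when we under-predict
        losses.append(torch.max(q * err, (q - 1.0) * err))
    return torch.mean(torch.stack(losses))
```

Because the loss decomposes independently over quantile levels and horizon steps, a single network can be trained to emit the full grid of forecasts in one pass, which is the efficiency argument for direct multi-horizon forecasting.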

Cited by 97 publications (126 citation statements) | References 16 publications
“…We analyze the generalization errors under multi-horizon and multi-quantile time series forecasting, characterized with quantities such as the Rademacher complexity and a discrepancy measure for stationarity (in Section 6). Under the state-of-the-art Seq2Seq MQ-CNN model [29], extensive experiments on real-world datasets demonstrate the consistency and accuracy improvement of our methodology with I(S)QF layers over various other baseline layers, e.g., default quantile [29], Gaussian [7], and SQF [9] (in Section 7).…”
Section: Introduction (mentioning)
confidence: 79%
“…Fortunately, quantile regression [14,13], which has been successfully used for robustly modeling probabilistic outputs, comes to the rescue. The incorporation of a quantile regression component into various sequential neural network backbones has been shown to be particularly effective with recent advances in deep learning [29,9,18,6]. Obtaining a full probabilistic prediction (i.e., the ability to query a forecast at an arbitrary quantile) usually requires generating multiple quantiles at once.…”
Section: Introduction (mentioning)
confidence: 99%
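One simple way to realize "generating multiple quantiles at once", as the excerpt above describes, is a projection head that emits every quantile level for every horizon step in a single forward pass. Monotonicity across levels (avoiding quantile crossing, the consistency issue the first excerpt alludes to) can be enforced by predicting the lowest quantile plus non-negative increments. This is a hedged sketch in PyTorch, not the construction of any cited paper; the class and parameter names are hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiQuantileHead(nn.Module):
    """Emit n_quantiles estimates per horizon step in one forward pass,
    with non-crossing quantiles via cumulative non-negative increments."""

    def __init__(self, hidden_dim, horizon, n_quantiles):
        super().__init__()
        self.horizon, self.n_quantiles = horizon, n_quantiles
        self.proj = nn.Linear(hidden_dim, horizon * n_quantiles)

    def forward(self, h):                       # h: (batch, hidden_dim)
        raw = self.proj(h).view(-1, self.horizon, self.n_quantiles)
        base = raw[..., :1]                     # unconstrained lowest quantile
        steps = F.softplus(raw[..., 1:])        # non-negative increments
        return torch.cat([base, base + steps.cumsum(dim=-1)], dim=-1)
```

Pairing such a head with the pinball loss sketched earlier trains all quantile levels jointly.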
“…State-space models (Durbin & Koopman, 2012; Hyndman et al., 2008a; West et al., 1985) provide a more general and interpretable framework for modeling time series by sequentially updating information to give better estimates; the Kalman filter (Welch et al., 1995) and exponential smoothing (Hyndman et al., 2008a) are both prominent examples. Deep neural networks (Becker et al., 2019; Krishnan et al., 2015; Lai et al., 2018; Rangapuram et al., 2018; Wen et al., 2017) can improve the ability to model complex data when enough history is available. However, it is very difficult to obtain a single model that works well in diverse situations.…”
Section: Related Work (mentioning)
confidence: 99%
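For concreteness, the exponential smoothing named in the excerpt above reduces to a one-line recursive update of a level state. A minimal sketch follows; the smoothing weight 0.3 is illustrative, not from any cited work:

```python
def simple_exponential_smoothing(series, alpha=0.3):
    """Level update: level_t = alpha * y_t + (1 - alpha) * level_{t-1}.

    Returns the final level, which serves as the one-step-ahead forecast.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# e.g. simple_exponential_smoothing([10.0, 12.0, 11.0, 13.0]) -> ~11.4
```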
“…However, it is difficult to obtain explicit formulas for modeling the correlation of different dimensions. Methods based on recurrent neural networks [12] or attention [13] usually do not explicitly model the correlation of the data across different dimensions, which limits the prediction performance of the model.…”
Section: Introduction (mentioning)
confidence: 99%