The 2006 IEEE International Joint Conference on Neural Network Proceedings
DOI: 10.1109/ijcnn.2006.247129
Data partition and variable selection for time series prediction using wrappers

Abstract: The purpose of this paper is a comparative study of a non-exhaustive, though representative, set of methodologies already available for the partition of the training dataset in time series prediction, and also for variable selection under the wrapper paradigm. The partition policy of the training dataset and the choice of a proper set of variables for the regression vector are known to have a significant influence in the accuracy of the predictor, no matter the choice of the prediction model. However, there ha…

Cited by 5 publications (8 citation statements)
References 24 publications
“…The Mutual Information (MI) method is a type of filter approach to subset selection. Filters are characterized by data-based methods, as the subset is chosen according to some correlation between the input candidate and the target [45,46].…”
Section: Mutual Information
confidence: 99%
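The filter approach described in this excerpt scores each input candidate by its statistical dependence on the target before any model is fitted. A minimal pure-Python sketch of a histogram-based MI estimator applied to lagged inputs follows; the binning scheme, the sine series, and the lag range are illustrative assumptions, not taken from the cited papers:

```python
import math
from collections import Counter

def mutual_information(x, y, bins=4):
    """Plug-in MI estimate from an equal-width-binned joint histogram."""
    def digitize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0  # guard against a constant series
        return [min(int((s - lo) / w), bins - 1) for s in v]
    bx, by = digitize(x), digitize(y)
    n = len(x)
    pxy = Counter(zip(bx, by))          # joint bin counts
    px, py = Counter(bx), Counter(by)   # marginal bin counts
    return sum((c / n) * math.log((c / n) / ((px[i] / n) * (py[j] / n)))
               for (i, j), c in pxy.items())

# Rank candidate lags of a series by MI with the one-step-ahead target.
series = [math.sin(0.3 * t) for t in range(200)]
target = series[3:]  # y(t)
candidates = {f"lag{k}": series[3 - k:len(series) - k] for k in (1, 2, 3)}
scores = {name: mutual_information(x, target) for name, x in candidates.items()}
```

The key property of the filter: `scores` can be computed once, independently of whichever prediction model is trained afterwards.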
“…The wrapper's main advantage is that the model is taken into account to analyze the influence of each input. However, because it is model-dependent, its computational effort tends to be high, as the model has to be adjusted to each candidate subset [46].…”
Section: Wrappers
confidence: 99%
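The model-dependent cost the excerpt mentions can be made concrete: a wrapper re-fits and re-scores the predictor for every candidate subset. A small sketch under stated assumptions (a toy 1-nearest-neighbour regressor as the wrapped model, exhaustive subset search, a synthetic two-input dataset where only the first input drives the target):

```python
import math
from itertools import combinations

def fit_predict_1nn(train_X, train_y, test_X):
    """Toy wrapped model: 1-nearest-neighbour regression on the selected variables."""
    def predict(x):
        d = [sum((a - b) ** 2 for a, b in zip(x, tx)) for tx in train_X]
        return train_y[d.index(min(d))]
    return [predict(x) for x in test_X]

def wrapper_select(X, y, split=0.7):
    """Score every non-empty variable subset by re-fitting the model on each."""
    n = int(len(X) * split)
    cols = range(len(X[0]))
    best, best_err = None, float("inf")
    for k in range(1, len(X[0]) + 1):
        for subset in combinations(cols, k):
            keep = lambda rows: [[r[i] for i in subset] for r in rows]
            preds = fit_predict_1nn(keep(X[:n]), y[:n], keep(X[n:]))
            err = sum((p - t) ** 2 for p, t in zip(preds, y[n:])) / len(preds)
            if err < best_err:  # one full model fit per candidate subset
                best, best_err = subset, err
    return best, best_err

# Column 0 drives the target; column 1 is an irrelevant input.
X = [[math.sin(0.3 * t), math.sin(7.7 * t + 1.0)] for t in range(200)]
y = [row[0] ** 2 for row in X]
best_subset, best_mse = wrapper_select(X, y)
```

Note that the number of model fits grows exponentially with the number of candidate inputs, which is exactly the computational burden the excerpt refers to.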
“…The variable selection methodologies can use information available a priori, empirical trial-and-error tests, or some information criterion. Puma-Villanueva et al. [47] describe a simple example of how the general process works. Consider a set V representing the space of input variables, here limited to three.…”
Section: Variable Selection
confidence: 99%
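The search space the excerpt alludes to is small enough to enumerate when V is limited to three variables: there are 2³ − 1 non-empty candidate subsets. A minimal sketch (the variable names are placeholders, not from the cited papers):

```python
from itertools import combinations

# V: the space of input variables, limited to three as in the excerpt's example.
V = ["v1", "v2", "v3"]

# Enumerate every non-empty subset of V; each is a candidate regression vector.
subsets = [set(c) for k in range(1, len(V) + 1) for c in combinations(V, k)]
# 2**3 - 1 = 7 candidates, from the singletons up to the full set.
```

Each of these seven candidates is what a wrapper would then evaluate by fitting the prediction model.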
“…For example, a variable being relevant does not mean that the optimal subset contains it. Likewise, the inputs that belong to the optimal subset are not necessarily relevant on their own [47]. Guyon and Elisseeff [17] classify the VS models into embedded, wrappers, and filters, each of them having its own advantages.…”
Section: Subsets
confidence: 99%
“…Spectral analysis [1] and autocorrelation analysis [1] are examples of filter methods. Wrappers select the features based on the performance of computational intelligence methods [19], [11]. Filters seem to be more suitable for selecting features for linear prediction models.…”
Section: B. Lag Selection
confidence: 99%
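Autocorrelation-based lag selection, one of the filter methods named in this excerpt, can be sketched in a few lines: compute the sample autocorrelation at each candidate lag and keep the lags whose magnitude clears a threshold. The series (period-12 sine), lag range, and 0.5 threshold are illustrative assumptions:

```python
import math

def autocorrelation(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x)
    cov = sum((x[t] - mu) * (x[t + lag] - mu) for t in range(n - lag))
    return cov / var

# A seasonal series with period 12; lags near multiples of 12 should stand out.
series = [math.sin(2 * math.pi * t / 12) for t in range(240)]
acf = {k: autocorrelation(series, k) for k in range(1, 25)}
selected = [k for k, r in acf.items() if abs(r) > 0.5]
```

Because the score depends only on the data, this is a filter in the excerpt's sense: no prediction model is fitted during selection.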