2019
DOI: 10.3390/en12193665
A Novel Feature Selection and Short-Term Price Forecasting Based on a Decision Tree (J48) Model

Abstract: A novel feature selection method based on a decision tree (J48) for price forecasting is proposed in this work. The method uses a genetic algorithm along with a decision tree classifier to obtain the minimum number of features giving an optimum forecast accuracy. The usefulness of the proposed approach is established through a performance test of the forecaster using the features selected by this approach. It is found that the forecast with the selected features consistently outperformed that having larg…
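The wrapper scheme the abstract describes (a genetic algorithm searching over feature subsets, scored by a decision-tree-style classifier) can be sketched roughly as follows. Everything here is illustrative, not the authors' implementation: the XOR toy data, the leaf-purity fitness (the training accuracy of a fully grown tree on the selected features), the per-feature penalty, and all GA parameters are assumptions.

```python
import random

random.seed(0)

N_FEATURES = 6
N_SAMPLES = 200

# Synthetic data: the label depends only on features 0 and 1 (XOR);
# features 2-5 are pure noise the GA should learn to discard.
X = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(N_SAMPLES)]
y = [x[0] ^ x[1] for x in X]

def fitness(mask):
    """Leaf purity of a fully grown tree restricted to the selected features
    (majority vote within each combination of selected values), minus a
    small penalty per selected feature to favour compact subsets."""
    selected = [i for i, bit in enumerate(mask) if bit]
    if not selected:
        return 0.0
    groups = {}
    for x, label in zip(X, y):
        groups.setdefault(tuple(x[i] for i in selected), []).append(label)
    correct = sum(max(g.count(0), g.count(1)) for g in groups.values())
    return correct / N_SAMPLES - 0.01 * len(selected)

def evolve(pop_size=20, generations=40, mutation_rate=0.1):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                           # elitism: keep the two best masks
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:10], 2)        # truncation selection
            cut = random.randint(1, N_FEATURES - 1)  # one-point crossover
            child = [bit ^ (random.random() < mutation_rate)  # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
print("selected features:", [i for i, bit in enumerate(best) if bit])
```

On this toy problem the penalty makes the two informative features the unique optimum, so the GA converges to a mask containing exactly features 0 and 1.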

Cited by 11 publications (7 citation statements). References 33 publications.
“…The usefulness of the approach is established through the performance test of the forecaster using the feature selected by this approach. It is found out that the forecast with the reduced features consistently outperformed a more extensive feature set [43].…”
Section: Feature Selection-based Classification Algorithms (mentioning)
confidence: 94%
“…Ciabattoni et al (2015) proposed a univariate filter method based on the Bayes error rate for feature selection in fault detection. Srivastava et al (2019) proposed a novel feature selection method based on a price prediction decision tree. This method used genetic algorithms and decision tree classifiers to obtain the smallest number of features for the best prediction accuracy.…”
Section: Feature Selection (mentioning)
confidence: 99%
“…In step 5, based on the value of the gain ratio, the attribute with the highest value is declared the root node, and the same computation is repeated from step 1 to step 4 for the intermediate nodes until all the instances are exhausted and a leaf node is reached as per step 2 [33].…”
Section: J48 Classifier (mentioning)
confidence: 99%
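The gain-ratio computation this quotation describes (the C4.5/J48 splitting criterion: information gain normalised by split information, with the highest-scoring attribute chosen as the node) can be sketched as follows. The weather-style toy dataset and its attributes are invented for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, labels, attr):
    """J48 (C4.5) gain ratio = information gain / split information."""
    n = len(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(label)
    info_gain = entropy(labels) - sum(
        len(p) / n * entropy(p) for p in partitions.values())
    split_info = -sum(
        (len(p) / n) * math.log2(len(p) / n) for p in partitions.values())
    return info_gain / split_info if split_info else 0.0

# Toy data: attribute 0 = outlook, attribute 1 = windy; label = play.
rows = [
    ("sunny", True), ("sunny", False), ("overcast", True), ("overcast", False),
    ("rain", True), ("rain", False), ("overcast", True), ("sunny", False),
]
labels = ["no", "no", "yes", "yes", "no", "yes", "yes", "no"]

ratios = {attr: gain_ratio(rows, labels, attr) for attr in range(2)}
root = max(ratios, key=ratios.get)
print("gain ratios:", ratios, "-> root attribute:", root)
```

Here "outlook" splits the labels almost perfectly while "windy" carries no information, so attribute 0 gets the highest gain ratio and is declared the root, exactly the step the quotation outlines.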