2021 9th International Conference on Cyber and IT Service Management (CITSM)
DOI: 10.1109/citsm52892.2021.9588947
Comparing Bitcoin's Prediction Model Using GRU, RNN, and LSTM by Hyperparameter Optimization Grid Search and Random Search

Cited by 21 publications (9 citation statements)
References 13 publications
“…The proposed model is compared by integrating different optimization algorithms such as the grid search and the random search [27]. The random selection of parameters in the random search approach results in high variance affecting the precision and recall of the model. It also suffers when handling high-dimensional data.…”
Section: Comparative Results
Mentioning (confidence: 99%)
“…The average precision and recall value of a total of three users in three rounds is computed and the results are presented in Table 4. The proposed model is compared by integrating different optimization algorithms such as the grid search and the random search [27]. The random selection of parameters in the random search approach results in high variance affecting the precision and recall of the model.…”
Section: Results
Mentioning (confidence: 99%)
“…The essence of the GSA is the enumeration method, which is costly in terms of time spent when the objective function is more complex [34]. Although the RSA no longer tests all values within a parameter range, randomly selected sample points in the search range may ignore optimal values [35]. The BOA is one of the most popular methods for tuning hyperparameters in deep learning models [36].…”
Section: Methods
Mentioning (confidence: 99%)
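The trade-off described in this Methods excerpt (exhaustive enumeration in grid search versus possibly missing good regions in random search) can be made concrete with a short sketch. The parameter names, ranges, and sample count below are illustrative assumptions, not values taken from the cited papers.

```python
# Illustrative sketch (hypothetical hyperparameter space): contrasting
# exhaustive grid search with random search over a similar space.
from sklearn.model_selection import ParameterGrid, ParameterSampler
from scipy.stats import uniform

# Grid search enumerates every combination (2 * 3 * 2 = 12 candidates here),
# which is why its cost grows quickly with the size of the grid.
grid_space = {
    "units": [32, 64],
    "learning_rate": [1e-3, 1e-2, 1e-1],
    "dropout": [0.1, 0.3],
}
grid_candidates = list(ParameterGrid(grid_space))

# Random search draws a fixed number of samples; continuous ranges are allowed,
# but the draws can miss the best region, which is the variance the excerpts note.
random_space = {
    "units": [32, 64, 128],
    "learning_rate": uniform(1e-4, 1e-1),  # samples from [1e-4, 1e-4 + 1e-1)
    "dropout": uniform(0.0, 0.5),
}
random_candidates = list(ParameterSampler(random_space, n_iter=12, random_state=0))

print(len(grid_candidates), "grid candidates vs", len(random_candidates), "random candidates")
```

With 12 random draws the sampler covers continuous ranges the grid cannot, but a different random_state can yield a noticeably different candidate set, which is the high-variance behaviour the quoted passages describe.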
“…This method aims to find the parameter combinations that will make the model perform best by systematically trying all possible combinations within a given set of hyperparameters. Especially in complex models, choosing the right hyperparameters can greatly affect the performance of the model (Buslim et al., 2021). In this study, this method was preferred to provide the best hyperparameter settings for the GRU model.…”
Section: Long and Short Term Memory (LSTM)
Mentioning (confidence: 99%)
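As a rough illustration of the grid-search procedure this excerpt describes, the sketch below exhaustively tries a small hyperparameter grid for a GRU regressor and keeps the configuration with the lowest validation loss. The grid values, data shapes, and training settings are assumptions for demonstration only and are not taken from the paper.

```python
# Illustrative sketch: manual grid search over a small GRU configuration space.
import itertools
import numpy as np
from tensorflow import keras

# Placeholder sliding-window data: (samples, timesteps, features).
X_train = np.random.rand(200, 30, 1)
y_train = np.random.rand(200, 1)

grid = {"units": [32, 64], "learning_rate": [1e-3, 1e-2], "batch_size": [16, 32]}
best_loss, best_config = float("inf"), None

# Systematically try every combination in the grid.
for units, lr, batch in itertools.product(*grid.values()):
    model = keras.Sequential([
        keras.layers.GRU(units, input_shape=(30, 1)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr), loss="mse")
    history = model.fit(X_train, y_train, epochs=5, batch_size=batch,
                        validation_split=0.2, verbose=0)
    val_loss = min(history.history["val_loss"])
    if val_loss < best_loss:
        best_loss = val_loss
        best_config = {"units": units, "learning_rate": lr, "batch_size": batch}

print("best configuration:", best_config, "val_loss:", best_loss)
```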