2020
DOI: 10.1109/access.2020.3021527
Evolving CNN-LSTM Models for Time Series Prediction Using Enhanced Grey Wolf Optimizer

Abstract: In this research, we propose an enhanced Grey Wolf Optimizer (GWO) for designing evolving Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) networks for time series analysis. To overcome the classical GWO algorithm's tendency to stagnate at local optima and its slow convergence rate, the newly proposed variant incorporates four distinctive search mechanisms. These comprise a nonlinear exploration scheme for dynamic search-territory adjustment, a chaotic leadership dispatching strategy, a…
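For orientation, the classical GWO update averages three pulls toward the pack leaders (alpha, beta, delta), with a control parameter `a` decaying from 2 to 0 to shift from exploration to exploitation. The sketch below uses a quadratic decay as an illustrative stand-in for the paper's nonlinear exploration scheme; the schedule, bounds, and population settings are assumptions, not the authors' exact formulation.

```python
import random

def gwo(objective, dim, n_wolves=20, n_iter=200, bounds=(-5.0, 5.0),
        nonlinear=True, seed=42):
    """Minimal Grey Wolf Optimizer sketch (minimization).

    nonlinear=True swaps the classical linear decay of the control
    parameter `a` for a quadratic decay -- an illustrative stand-in for
    a nonlinear exploration scheme, not the paper's exact formula.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]

    for t in range(n_iter):
        wolves.sort(key=objective)  # alpha, beta, delta lead the pack
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]

        # Classical GWO decays `a` linearly from 2 to 0; the quadratic
        # schedule below keeps exploration high for longer.
        frac = t / n_iter
        a = 2.0 * (1.0 - frac ** 2) if nonlinear else 2.0 * (1.0 - frac)

        for i in range(n_wolves):
            new_pos = []
            for d in range(dim):
                guided = []
                for leader in (alpha, beta, delta):
                    A = 2.0 * a * rng.random() - a  # exploration coefficient
                    C = 2.0 * rng.random()
                    D = abs(C * leader[d] - wolves[i][d])
                    guided.append(leader[d] - A * D)
                x = sum(guided) / 3.0                # average of leader pulls
                new_pos.append(min(max(x, lo), hi))  # clamp to bounds
            wolves[i] = new_pos

    wolves.sort(key=objective)
    return wolves[0]
```

On a toy sphere function, `gwo(lambda x: sum(v * v for v in x), dim=2)` converges near the origin; evolving a CNN-LSTM instead means encoding its hyperparameters as the position vector and scoring each wolf by validation error.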

Cited by 111 publications (43 citation statements)
References 91 publications (139 reference statements)
“…Furthermore, the aforementioned A-BiLSTM architecture implemented in this research was shown to be highly effective, but with further experimentation with different layer and hyperparameter settings [24][25][26][27][28][29][30][31][32], additional improvements in performance could be made. Evolutionary algorithms [33][34][35][36][37][38][39][40][41][42][43][44][45][46][47][48][49] could also be exploited pertaining to the above parameter tuning as well as architecture generation processes. Moreover, it would also be beneficial to employ additional medical audio datasets to further evaluate model efficiency.…”
Section: Discussion
confidence: 99%
“…GWO has recently been used for other exploration problems. For instance, in [23], the authors used the GWO algorithm for evolving Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) networks for time series analysis. They showed that GWO can produce significantly better results than other meta-heuristic methods.…”
Section: Other Related Work
confidence: 99%
“…The final results of the trial-and-error method show that three convolutional layers with a 1 × 1 kernel, followed by three successive BiLSTM layers, achieve the best results according to the R² criterion. For the remaining optimization work, ES is employed to tune the critical hyperparameters of the proposed computing framework, specifically the batch size, the number of units, and the number of training epochs [49,50]. To tune these hyperparameters, the open-source Hyperactive library is employed [39].…”
Section: Model Construction and Parameter Settings
confidence: 99%
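The tuning loop the last excerpt describes (searching over batch size, unit count, and epoch count) can be sketched as a plain random search. The value grids and the `score_fn` below are hypothetical placeholders, not the cited authors' settings or the Hyperactive library's API.

```python
import random

# Hypothetical search space over the hyperparameters the excerpt names;
# the value grids are illustrative, not the authors'.
SEARCH_SPACE = {
    "batch_size": [16, 32, 64, 128],
    "units": [32, 64, 128, 256],
    "epochs": [20, 50, 100],
}

def random_search(score_fn, space, n_trials=30, seed=0):
    """Sample configurations uniformly and keep the best-scoring one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(choices) for name, choices in space.items()}
        score = score_fn(cfg)  # e.g. validation R^2 of the trained model
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

In practice `score_fn` would train the CNN-BiLSTM with the sampled configuration and return its validation R²; dedicated tuners add smarter sampling on top of this same loop.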