2023
DOI: 10.3390/mi14010149

Architecture Optimization of a Non-Linear Autoregressive Neural Networks for Mackey-Glass Time Series Prediction Using Discrete Mycorrhiza Optimization Algorithm

Abstract: Recurrent Neural Networks (RNN) are primarily used for applications with time series and sequential data and are currently being used in embedded devices. However, one of their drawbacks is that RNNs have a high computational cost and require a significant amount of memory. Therefore, computer equipment with a large processing capacity and memory is required. In this article, we experiment with Nonlinear Autoregressive Neural Networks (NARNN), which are a type of RNN, and we use the Discrete Mycorrhiza Optimization Algorithm (DMOA) …
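For context on the abstract, the Mackey-Glass benchmark named in the title comes from the delay differential equation dx/dt = βx(t−τ)/(1 + x(t−τ)^n) − γx(t). The sketch below is an illustrative assumption of the setup, not the authors' code: it integrates the series with simple Euler steps under the common parameter choice β = 0.2, γ = 0.1, n = 10, τ = 17, and frames one-step-ahead prediction as a nonlinear autoregressive (NAR) task by regressing the next value on a window of past values. The names mackey_glass, make_nar_dataset, and lags are hypothetical.

```python
# Minimal sketch (not the paper's code): generate the Mackey-Glass series and
# frame it as a nonlinear autoregressive (NAR) one-step-ahead prediction task.
import numpy as np

def mackey_glass(n_steps=3000, beta=0.2, gamma=0.1, n=10, tau=17, dt=1.0, x0=1.2):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)**n) - gamma*x(t)."""
    delay = int(tau / dt)
    x = np.zeros(n_steps + delay)
    x[: delay + 1] = x0                      # constant history up to t = 0
    for t in range(delay, n_steps + delay - 1):
        x_tau = x[t - delay]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x[t])
    return x[delay:]

def make_nar_dataset(series, lags=4):
    """Stack lagged values so x(t+1) is predicted from [x(t), ..., x(t-lags+1)]."""
    X = np.column_stack([series[i: len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    return X, y

if __name__ == "__main__":
    series = mackey_glass()
    X, y = make_nar_dataset(series, lags=4)
    print(X.shape, y.shape)                  # (2996, 4) (2996,)
```

A NARNN would replace the linear lag framing here with a small feedforward network over the same lagged inputs; those architecture choices (number of lags, hidden units) are what the article optimizes with DMOA.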

Cited by 5 publications (3 citation statements) · References 82 publications

Citation statements (ordered by relevance):
“…We have previously applied the DMOA algorithm in the optimization of the architecture of a non-linear autoregressive neural network for Mackey-Glass time series prediction [69], and in this article in the adaptation of the parameters of the fuzzy systems T1FLS and IT2FLS. Additionally, hypothesis tests were carried out and the results obtained in both investigations were favorable for the DMOA optimization algorithm.…”
Section: Discussion
confidence: 99%
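The excerpt above describes using DMOA to choose the NARNN architecture. The DMOA update rules are not reproduced on this page, so the sketch below substitutes plain random search over a discrete space of (lags, hidden units), scored by validation MSE, purely to illustrate the shape of such an architecture search; the stand-in series, the random-feature model, and all names are hypothetical.

```python
# Hedged sketch: a generic discrete search over NARNN-style architecture choices
# (number of input lags, hidden units), scored by validation MSE. Plain random
# search is used here for illustration only; it is NOT the DMOA update rule.
import numpy as np

rng = np.random.default_rng(0)

def fit_and_score(series, lags, hidden):
    """Fit a tiny random-feature network (least-squares readout); return validation MSE."""
    X = np.column_stack([series[i: len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    split = int(0.7 * len(y))
    W = rng.normal(size=(lags, hidden))          # fixed random hidden-layer weights
    H = np.tanh(X @ W)                           # hidden activations
    w_out, *_ = np.linalg.lstsq(H[:split], y[:split], rcond=None)
    return float(np.mean((H[split:] @ w_out - y[split:]) ** 2))

# Stand-in series; in practice this would be the Mackey-Glass data from the earlier sketch.
series = np.sin(0.1 * np.arange(2000)) + 0.05 * rng.normal(size=2000)

best = None
for _ in range(30):                              # random search over the discrete space
    cand = (int(rng.integers(2, 12)), int(rng.integers(4, 64)))
    mse = fit_and_score(series, *cand)
    if best is None or mse < best[1]:
        best = (cand, mse)

print("best (lags, hidden units):", best[0], "validation MSE:", best[1])
```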
“…The models were trained for 20 epochs, and the best-performing model on the validation set was chosen for evaluation on the test set. For a fair comparison, the same experimental setup was applied to the Vanilla RNN, GRU, Attention Mechanism, and Transformer-based models [11]. Hyperparameters, such as hidden size, input size, and learning rate, were tuned based on validation set performance.…”
Section: Transformer-based Model
confidence: 99%
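The excerpt above describes the citing paper's protocol: a fixed 20-epoch budget, selection of the best model on the validation set, and final evaluation on the test set. The following is a minimal sketch of that selection loop, assuming a toy linear autoregressive model and a synthetic series as placeholders; it is not the cited models or data.

```python
# Minimal sketch of the selection protocol in the excerpt above: train for 20 epochs,
# keep the parameters that score best on the validation set, then report test error.
# The linear autoregressive model and synthetic series are placeholders only.
import numpy as np

rng = np.random.default_rng(1)
series = np.sin(0.1 * np.arange(1500)) + 0.05 * rng.normal(size=1500)

lags = 8
X = np.column_stack([series[i: len(series) - lags + i] for i in range(lags)])
y = series[lags:]
n = len(y)
X_tr, y_tr = X[: int(0.6 * n)], y[: int(0.6 * n)]
X_val, y_val = X[int(0.6 * n): int(0.8 * n)], y[int(0.6 * n): int(0.8 * n)]
X_te, y_te = X[int(0.8 * n):], y[int(0.8 * n):]

w = np.zeros(lags)
best_w, best_val = w.copy(), np.inf
lr = 1e-3
for epoch in range(20):                          # fixed 20-epoch budget, as in the excerpt
    grad = 2.0 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
    w -= lr * grad                               # one full-batch gradient step per "epoch"
    val_mse = np.mean((X_val @ w - y_val) ** 2)
    if val_mse < best_val:                       # keep the best-on-validation parameters
        best_val, best_w = val_mse, w.copy()

test_mse = np.mean((X_te @ best_w - y_te) ** 2)
print(f"best validation MSE {best_val:.3e} | test MSE {test_mse:.3e}")
```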
“…Although they possess distinct advantages and challenges, both methods offer significant benefits in time series forecasting. Researchers worldwide are developing diverse artificial intelligence (AI) models to meet the growing need to predict future events accurately [5]. Neural Networks (NNs) and Fuzzy Systems (FS) are two types of artificial intelligence models that are increasingly utilized to make precise and dependable predictions in time series forecasting.…”
Section: Introduction
confidence: 99%