2022
DOI: 10.1016/j.asoc.2021.108375
An automatic selection of optimal recurrent neural network architecture for processes dynamics modelling purposes

Cited by 14 publications (3 citation statements)
References 36 publications
“…where X_i(t) is the input value at time t, W_ij(t) is the weight of the neural input at time t, b_ij is the bias, F is a transfer function, and y(t) is the output value at time t. Recurrent neural networks (RNNs) are variants of neural networks that are well suited to processing sequential data [47]. The structure of the ANN is organized recurrently, such that output data are fed back as input data: the stored output of the previous time step t − 1 is added to the inputs of the current time step t. This configuration means that a change in the state of an individual neuron can be transferred via feedback to the other neurons, invoking transient states and generally leading to another state of the network [48].…”
Section: Basic Concepts Of The Algorithms Used In This Study
confidence: 99%
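The feedback mechanism described in this statement (the stored output of step t − 1 added to the weighted inputs of step t) can be illustrated with a minimal sketch. The function name recurrent_neuron_step, the feedback weight w_rec, and the toy data are assumptions for illustration only, not the cited papers' implementation.

```python
import numpy as np

def recurrent_neuron_step(x_t, y_prev, w_in, w_rec, b, f=np.tanh):
    """One time step of a simple recurrent neuron: the stored output
    from step t-1 (y_prev) is fed back and added to the weighted
    inputs of the current step t before the transfer function f."""
    return f(np.dot(w_in, x_t) + w_rec * y_prev + b)

# toy run over a short input sequence
rng = np.random.default_rng(0)
x_seq = rng.normal(size=(5, 3))   # 5 time steps, 3 inputs each
w_in = rng.normal(size=3)         # input weights W_ij
w_rec = 0.5                       # feedback weight for y(t-1)
b = 0.1                           # bias b_ij
y = 0.0                           # initial stored output
for x_t in x_seq:
    y = recurrent_neuron_step(x_t, y, w_in, w_rec, b)
    print(y)
```

Because each output depends on the previous one, a change in one neuron's state propagates forward through the feedback path, which is the transient behaviour the quoted passage refers to.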
“…Figure 1 shows an illustrative example of the main concepts of NNs [19][20][21][22]. There are three layers in this example.…”
Section: Neural Optimization Machine (A) A Brief Introduction To NNs
confidence: 99%
“…At the initial stage, the neuron receives signals at its inputs, sums them, and passes the result through an activation function, producing the output signal (Figure 1a). A multilayer feedforward neural network [19] (Figure 1b) is an oriented graph with one-way directed edges. A significant disadvantage of neural networks is that they solve pattern-recognition tasks as a black box.…”
Section: Introduction
confidence: 99%
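The neuron and multilayer feedforward structure described in this statement can be sketched as follows. The names neuron and feedforward, the tanh activation, and the layer sizes are hypothetical choices for illustration, not taken from the cited papers.

```python
import numpy as np

def neuron(x, w, b, f=np.tanh):
    """Weighted sum of the inputs plus a bias, passed through an
    activation function f, yields the neuron's output signal."""
    return f(np.dot(w, x) + b)

def feedforward(x, layers):
    """Propagate an input through a stack of (W, b) layers: a
    multilayer feedforward network viewed as a directed graph with
    one-way edges from one layer to the next."""
    for W, b in layers:
        x = neuron(x, W, b)
    return x

rng = np.random.default_rng(1)
layers = [(rng.normal(size=(4, 3)), rng.normal(size=4)),   # hidden layer
          (rng.normal(size=(1, 4)), rng.normal(size=1))]   # output layer
print(feedforward(rng.normal(size=3), layers))
```

The sketch also shows the black-box character the passage criticises: the prediction is produced by opaque weight matrices rather than by interpretable rules.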