2022
DOI: 10.1016/j.compeleceng.2022.107823

Power management strategy based on Elman neural network for grid-connected photovoltaic-wind-battery hybrid system

Cited by 25 publications (14 citation statements)
References 22 publications
“…The output of the hidden layer feeds into the context layer, and the output of the context layer is fed back to the hidden layer, so the network retains a memory of historical data and can adapt to time-varying behavior. Compared with a feed-forward neural network, the ENN has greater computational ability and stability [24–27]. The specific expression of the ENN is as follows:
$$\begin{aligned} r(k) &= f\big(W_2\,u(k-1) + W_1\,r_c(k) + b_1\big)\\ r_c(k) &= r(k-1)\\ p(k) &= g\big(W_3\,r(k) + b_2\big) \end{aligned}$$
where $u(k-1)$ is the input vector of the neural network, $r(k)$ is the hidden-layer vector, $p(k)$ is the output vector, and $r_c(k)$ is the context (undertaking)-layer vector.…”
Section: Detection Methods (mentioning)
Confidence: 99%
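To make the quoted recursion concrete, here is a minimal NumPy sketch of the Elman update equations above. The layer sizes, random weights, tanh hidden activation for f, and linear output activation for g are illustrative assumptions; they are not taken from the cited paper.

```python
import numpy as np

# Minimal sketch of the ENN recursion quoted above; layer sizes, random
# weights, and the choices f = tanh, g = identity are assumptions made
# purely for illustration.
n_in, n_hid, n_out = 3, 5, 1
rng = np.random.default_rng(0)

W1 = rng.standard_normal((n_hid, n_hid))   # context -> hidden weights
W2 = rng.standard_normal((n_hid, n_in))    # input   -> hidden weights
W3 = rng.standard_normal((n_out, n_hid))   # hidden  -> output weights
b1 = np.zeros(n_hid)
b2 = np.zeros(n_out)

def enn_step(u_prev, r_prev):
    """One time step: r(k) = f(W2 u(k-1) + W1 r_c(k) + b1),
    r_c(k) = r(k-1), p(k) = g(W3 r(k) + b2)."""
    r_c = r_prev                              # context layer holds the previous hidden state
    r = np.tanh(W2 @ u_prev + W1 @ r_c + b1)  # hidden-layer output r(k)
    p = W3 @ r + b2                           # network output p(k), with g assumed linear
    return r, p

# Drive the recursion with a short dummy input sequence.
r = np.zeros(n_hid)
for u_prev in rng.standard_normal((10, n_in)):
    r, p = enn_step(u_prev, r)
print("final output p(k):", p)
```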
“…The output of the hidden layer feeds into the context layer, and the output of the context layer is fed back to the hidden layer, so the network retains a memory of historical data and can adapt to time-varying behavior. Compared with a feed-forward neural network, the ENN has greater computational ability and stability [24][25][26][27]. The specific expression of the ENN is as follows:…”
Section: Elman Neural Network Prediction Model (mentioning)
Confidence: 99%
“…The ENN is a recurrent network characterized by an internal self-referencing layer. It consists of four components: an input layer, a context layer, a hidden layer, and an output layer (Boualem et al., 2022). The context layer is designed to store, or memorize, the previous output values of the hidden layer.…”
Section: Methods and Data (mentioning)
Confidence: 99%
“…The nonlinear expression of the ENN is as follows (Boualem et al., 2022; Ruiz et al., 2018):
$$\begin{aligned} x(k) &= f\big(W_1\,x_c(k) + W_2\,u(k-1)\big)\\ x_c(k) &= x(k-1)\\ y(k) &= g\big(W_3\,x(k)\big) \end{aligned}$$
where $k$ is the training step of the ENN; $y$ denotes the n-dimensional output vector; $x$ represents the hidden-layer neuron output vector; $x_c$ denotes the feedback state vector; $u$ is the input vector; $g$ represents the transfer function of the output neurons; $W_1$, $W_2$ and $W_3$ are the connection weight matrices; and $f$ represents the transfer function of the hidden-layer neurons.…”
Section: Methods and Data (mentioning)
Confidence: 99%
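As a companion to the quoted definitions, the sketch below expresses the same recursion in the x / x_c / y notation used in this excerpt. The transfer functions (tanh for f, identity for g), the split of the connection weights into W1, W2, W3, and all dimensions are assumptions made for illustration only.

```python
import numpy as np

class ElmanSketch:
    """Illustrative Elman network in the quoted notation:
    x(k) = f(W1 x_c(k) + W2 u(k-1)),  x_c(k) = x(k-1),  y(k) = g(W3 x(k)).
    f = tanh and g = identity are assumed; the excerpt does not specify them."""

    def __init__(self, n_in, n_hid, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.standard_normal((n_hid, n_hid))  # context -> hidden
        self.W2 = rng.standard_normal((n_hid, n_in))   # input   -> hidden
        self.W3 = rng.standard_normal((n_out, n_hid))  # hidden  -> output
        self.x = np.zeros(n_hid)                       # hidden state x(k-1)

    def step(self, u_prev):
        x_c = self.x                                         # feedback state vector x_c(k)
        self.x = np.tanh(self.W1 @ x_c + self.W2 @ u_prev)   # hidden-layer output x(k)
        return self.W3 @ self.x                              # output y(k)

# Example: run the network over a short random input sequence.
net = ElmanSketch(n_in=2, n_hid=4, n_out=1)
for u_prev in np.random.default_rng(1).standard_normal((5, 2)):
    y = net.step(u_prev)
print("y(k) =", y)
```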