2022
DOI: 10.3390/app12052305

An Improved Gated Recurrent Unit Neural Network for State-of-Charge Estimation of Lithium-Ion Battery

Abstract: The state of charge (SOC) of a lithium-ion battery is a key parameter of the battery management system (BMS), but it cannot be measured directly. To predict SOC accurately, this paper proposes a recurrent neural network, a gated recurrent unit network optimized by a genetic algorithm (GA-GRU). The GA is introduced to optimize the key parameters of the model, which improves the performance of the proposed network. Furthermore, batteries were tested under four dynamic driving cond…
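As context for the citation statements below, here is a minimal sketch of a GRU-based SOC regressor in PyTorch. The input features (voltage, current, temperature), layer sizes, and the sigmoid output head are illustrative assumptions, not the configuration reported in the paper; in the paper, the layer and neuron counts are the quantities tuned by the genetic algorithm.

```python
import torch
import torch.nn as nn

class GRUSocEstimator(nn.Module):
    """Minimal GRU regressor: maps a sequence of battery measurements to SOC.

    Assumed inputs per time step: voltage, current, temperature (3 features).
    Hidden size and layer count are placeholders for the hyperparameters the
    GA would search over.
    """

    def __init__(self, n_features: int = 3, hidden_size: int = 64, num_layers: int = 2):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features)
        out, _ = self.gru(x)
        # Use the hidden state at the last time step; squash to [0, 1] as SOC.
        return torch.sigmoid(self.head(out[:, -1, :]))

# Example usage with random data standing in for measured driving cycles.
model = GRUSocEstimator()
batch = torch.randn(8, 100, 3)   # 8 sequences, 100 time steps, 3 features
soc_pred = model(batch)          # shape: (8, 1)
print(soc_pred.shape)
```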

Cited by 21 publications (7 citation statements)
References 28 publications
“…However, due to the heavy mathematical processing load of the LSTM method, the GRU method has emerged as a simpler alternative. The GRU method is one of the most widely used methods in battery SOC estimation, as it consists of fewer gates and processes data faster [18]. Additionally, the BiLSTM method, which is also used in studies to estimate battery SOC, is based on the LSTM method and transfers information in both directions [19].…”
Section: Related Work (mentioning)
confidence: 99%
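To make the "fewer gates, faster processing" comparison concrete, the snippet below (a sketch, not code from either paper) instantiates same-width LSTM, GRU, and bidirectional LSTM layers in PyTorch and counts their parameters; the input and hidden sizes are arbitrary illustrative values.

```python
import torch.nn as nn

def n_params(module: nn.Module) -> int:
    return sum(p.numel() for p in module.parameters())

input_size, hidden_size = 3, 64

lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
gru = nn.GRU(input_size, hidden_size, batch_first=True)
bilstm = nn.LSTM(input_size, hidden_size, batch_first=True, bidirectional=True)

# The LSTM uses 4 gate blocks, the GRU only 3, so the GRU carries roughly 3/4
# of the LSTM's weights; the bidirectional LSTM runs one LSTM per direction
# and therefore doubles the count.
print("LSTM   :", n_params(lstm))
print("GRU    :", n_params(gru))
print("BiLSTM :", n_params(bilstm))
```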
“…Unlike other neural networks, the RNN consists of copies of a single unit connected like a chain, a structure that allows it to process sequential data. A simple RNN suffers from gradient vanishing and gradient explosion; LSTM and GRU were established to solve these problems [11,12]. In comparison with LSTM, the GRU cell has a simpler structure because it has one fewer gate [28]. Therefore, under the same network structure, the GRU has fewer parameters than the LSTM, which can reduce the risk of overfitting and boost the convergence rate, while its performance is comparable to the LSTM cell [18].…”
Section: GRU Algorithm (mentioning)
confidence: 99%
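The parameter saving mentioned in this statement follows directly from the gate count. Per layer, with input width n_x and hidden width n_h, the standard cells need roughly the following number of weights (textbook bookkeeping, ignoring implementation-specific bias conventions; not a figure from the cited paper):

```latex
\underbrace{4\left(n_h n_x + n_h^2 + n_h\right)}_{\text{LSTM}}
\quad\text{vs.}\quad
\underbrace{3\left(n_h n_x + n_h^2 + n_h\right)}_{\text{GRU}}
```

So a GRU layer carries about three quarters of the weights of an LSTM layer of the same width.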
“…The GRU architecture is shown in Fig. 4 (a recurrent neural network schematic). In comparison with a simple RNN, the GRU-RNN has a relevance gate, which is responsible for controlling the retention of historical information, and an update gate, which controls the effect of the previous cell state and the current input on the new cell state [28]. Information from the current input and information from the previous hidden state, which can be considered the memory of the network, is passed through the sigmoid function, so the values come out between 0 and 1.…”
Section: GRU Algorithm (mentioning)
confidence: 99%
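The gate behaviour described above is easy to see in a single GRU step. The sketch below uses NumPy and the standard GRU formulation; the variable names and sizes are illustrative and not tied to the cited implementation. Both the relevance/reset gate and the update gate pass through the sigmoid, so their entries lie strictly between 0 and 1.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, W_z, U_z, b_z, W_r, U_r, b_r, W_h, U_h, b_h):
    """One GRU update following the standard formulation."""
    z = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)               # update gate, in (0, 1)
    r = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)               # reset / relevance gate, in (0, 1)
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r * h_prev) + b_h)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde                   # new hidden state

# Tiny example with random weights: 3 input features, 4 hidden units.
rng = np.random.default_rng(0)
n_x, n_h = 3, 4
params = [rng.standard_normal(s) for s in
          [(n_h, n_x), (n_h, n_h), (n_h,)] * 3]
x_t, h_prev = rng.standard_normal(n_x), np.zeros(n_h)
h_t = gru_step(x_t, h_prev, *params)
print(h_t)
```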
“…The authors used GA to optimize the number of layers and neurons in the GRU model structure. The results showed that this method is highly accurate and robust [32]. Zhang et al. combined GA with a fuzzy logic control neural network algorithm, utilizing the neural network to accurately estimate the dynamic SOC of the battery based on the static SOC.…”
Section: Introduction (mentioning)
confidence: 99%
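To illustrate the GA-GRU idea referenced here (a GA searching over the number of GRU layers and neurons), below is a compact sketch of a genetic loop over (num_layers, hidden_size) genomes. The fitness function is a placeholder that would train and validate a GRU SOC model; the population size, mutation rate, search ranges, and the `validation_rmse` helper are hypothetical choices, not the paper's settings.

```python
import random

# Hypothetical search space for the GRU hyperparameters tuned by the GA.
LAYER_CHOICES = [1, 2, 3]
HIDDEN_CHOICES = [16, 32, 64, 128, 256]

def validation_rmse(num_layers: int, hidden_size: int) -> float:
    """Placeholder fitness: in a real run, train a GRU SOC model with these
    hyperparameters and return its validation RMSE (lower is better)."""
    return random.random()  # stand-in so the sketch runs end to end

def crossover(a, b):
    # Swap one gene between two parents.
    return (a[0], b[1]) if random.random() < 0.5 else (b[0], a[1])

def mutate(genome, rate=0.2):
    layers, hidden = genome
    if random.random() < rate:
        layers = random.choice(LAYER_CHOICES)
    if random.random() < rate:
        hidden = random.choice(HIDDEN_CHOICES)
    return (layers, hidden)

def genetic_search(pop_size=8, generations=5):
    population = [(random.choice(LAYER_CHOICES), random.choice(HIDDEN_CHOICES))
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=lambda g: validation_rmse(*g))
        parents = scored[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=lambda g: validation_rmse(*g))

print("best (num_layers, hidden_size):", genetic_search())
```

Truncation selection with one-point crossover and per-gene mutation is only one of many GA designs; the encoding and operators used in the cited paper may differ.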