2016
DOI: 10.1063/1.4958527

Backpropagation neural network models for LiFePO4 battery

Abstract: Neural networks have been used in system control, medicine, pattern recognition, and business. The backpropagation neural network (BPNN) appears to be the most popular and has been widely used in many applications. BPNN is a supervised learning technique for training multilayer feedforward neural networks. A BPNN is trained with the gradient (steepest descent) method by adjusting the weights; the purpose of updating the numerical weights is to minimize the network's error between target and output. In this paper, …
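The training loop the abstract describes — a forward pass, then steepest-descent weight updates that shrink the error between target and output — can be sketched for a toy 2-2-1 feedforward network. This is an illustrative minimal example (learning the OR function), not the paper's LiFePO4 battery model:

```python
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Weights and biases for a 2-input, 2-hidden, 1-output network.
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_ho = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0
lr = 0.5  # learning rate for steepest descent

def forward(x):
    h = [sigmoid(b_h[j] + sum(w_ih[j][i] * x[i] for i in range(2)))
         for j in range(2)]
    y = sigmoid(b_o + sum(w_ho[j] * h[j] for j in range(2)))
    return h, y

# Toy supervised data: the OR function.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

for epoch in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Output delta: derivative of squared error times sigmoid'.
        d_out = (y - t) * y * (1 - y)
        # Hidden deltas, propagated backward through w_ho.
        d_hid = [d_out * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Steepest-descent weight and bias updates.
        b_o -= lr * d_out
        for j in range(2):
            w_ho[j] -= lr * d_out * h[j]
            b_h[j] -= lr * d_hid[j]
            for i in range(2):
                w_ih[j][i] -= lr * d_hid[j] * x[i]

predictions = [round(forward(x)[1]) for x, _ in data]
print(predictions)  # OR recovered: [0, 1, 1, 1]
```

The update rule subtracts the gradient of the squared error from each weight, which is exactly the "adjusting the weights" step the abstract attributes to gradient/steepest descent.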

Cited by 3 publications (4 citation statements); references 4 publications.
“…Thus, the multilayer perceptron neural network trained with BPNN is the most popular and widely used network paradigm employed in engineering applications to solve practical problems, and it has demonstrated exceptional performance [53,54,56,59,60]. Inevitably, the traditional BPNN algorithm has some shortcomings, such as slow convergence and a tendency to fall into local minima, but remedies have been proposed for these problems [61].…”
Section: Methods
confidence: 99%
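The statement does not say which remedies [61] actually proposes; purely as an illustration, momentum is one widely used fix for slow convergence and shallow local minima in plain gradient descent. The function below is a hypothetical helper, not taken from the cited work:

```python
# Illustrative only: a steepest-descent step augmented with momentum,
# a common remedy for the slow-convergence / local-minimum issues
# mentioned above (the specific remedy in [61] is not given here).

def momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    """One gradient-descent update with momentum for a weight vector."""
    new_v = [beta * v - lr * g for v, g in zip(velocity, grad)]
    new_w = [wi + vi for wi, vi in zip(w, new_v)]
    return new_w, new_v

# Usage: minimize f(w) = w^2 (gradient 2w) starting from w = 5.
w, v = [5.0], [0.0]
for _ in range(300):
    grad = [2 * wi for wi in w]
    w, v = momentum_step(w, grad, v)
print(abs(w[0]) < 1e-3)  # True: momentum carries w close to the minimum
```

The velocity term accumulates past gradients, so the update keeps moving through flat regions where plain steepest descent would crawl.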
“…The former transmits the output values layer by layer, while the latter sums the error derivatives for weights in the reverse direction until all the data are run through the network once (Dong et al, 2020). This constitutes an epoch, and the weights are updated after each epoch such that the model error decreases (Primadusi et al, 2016).…”
Section: BPNN
confidence: 99%
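The epoch-wise scheme described above — sum the error derivatives while all the data run through the network once, then update the weights — can be sketched with a single linear neuron. The data and learning rate here are invented for illustration:

```python
# Sketch of batch (epoch-wise) gradient descent on one linear neuron:
# error derivatives are accumulated over every sample, and the weight
# is updated once per epoch, as the citation statement describes.

def train_epochs(data, w=0.0, lr=0.05, epochs=200):
    for _ in range(epochs):
        grad = 0.0
        for x, t in data:          # run all data through the network once
            y = w * x              # forward pass
            grad += (y - t) * x    # accumulate the error derivative
        w -= lr * grad             # single weight update per epoch
    return w

# Toy target relation y = 2x.
w = train_epochs([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(round(w, 3))  # → 2.0
```

Contrast this with the per-sample updates of online training: here the model error decreases once per full pass over the data, which is the "epoch" the excerpt defines.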
“…This partitioning is unique for each tree in the forest and hence provides significant internal validation. As a result, RF can overcome the disadvantages of overfitting and instability, and it has good robustness and high interpretability (Khoshgoftaar et al., 2007; Primadusi et al., 2016). More specific details of the RF algorithm can be found in Breiman's article (Breiman, 2001).…”
Section: RF
confidence: 99%
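The per-tree partitioning the statement refers to can be sketched as follows; this shows only the bootstrap sampling and the out-of-bag rows that provide the internal validation, not a full random forest:

```python
import random

random.seed(1)

def bootstrap_split(n_rows, n_trees):
    """For each tree, draw a bootstrap sample (with replacement) and
    record the out-of-bag (OOB) rows it never saw; each tree would be
    fit on its in-bag rows and validated on its own OOB rows."""
    splits = []
    for _ in range(n_trees):
        in_bag = [random.randrange(n_rows) for _ in range(n_rows)]
        bag = set(in_bag)
        oob = [i for i in range(n_rows) if i not in bag]
        splits.append((in_bag, oob))
    return splits

splits = bootstrap_split(n_rows=10, n_trees=3)
for in_bag, oob in splits:
    # The bootstrap sample and its OOB rows never overlap, so the OOB
    # error is an honest internal validation estimate for that tree.
    assert not set(in_bag) & set(oob)
```

Because each tree's partition is drawn independently, the pooled OOB predictions approximate cross-validation without a separate hold-out set, which is the robustness property the excerpt credits to RF.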
“…A typical BPNN has a simple network topology consisting of three layers: input layer, hidden layer, and output layer as shown in Figure 11. The learning process is divided into two phases: the forward signal propagation phase and the reverse error propagation phase [48].…”
Section: Chapter
confidence: 99%