2019
DOI: 10.1016/j.applthermaleng.2019.114072

Production capacity analysis and energy saving of complex chemical processes using LSTM based on attention mechanism

Cited by 54 publications (16 citation statements)
References 21 publications
“…It was clear that the LSTM-NN model had the highest prediction accuracy among all models. Similarly, Han et al. [19] established an LSTM model for production prediction and energy saving in complex chemical processes, and their results also showed that the average relative error of the LSTM was clearly better than that of the BP network. In addition, MSE and RMSE, which served to assess the differences between the expected and actual values, showed the same result.…”
Section: Comparison Between BP-NN and LSTM-NN (mentioning)
confidence: 98%
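
For reference, the error metrics named in this excerpt are standard; assuming the cited works use the conventional definitions (with y_i the observed values and ŷ_i the model predictions over n samples), they are:

$$
\mathrm{MSE}=\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^2,\qquad
\mathrm{RMSE}=\sqrt{\mathrm{MSE}},\qquad
\bar{e}_{\mathrm{rel}}=\frac{1}{n}\sum_{i=1}^{n}\frac{\left|y_i-\hat{y}_i\right|}{\left|y_i\right|}.
$$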
“…[18] However, the main difference between them is that in BP-NN the knowledge obtained during the training phase is distributed across the whole network in the form of connection weights and thresholds between neurons, whereas LSTM-NN also uses internal memory units to store knowledge. [19,29] The structure of the LSTM block is shown in Figure 2; it is configured mainly by an input gate (i), a forget gate (f), an output gate (o), and the memory cell state. [12] Specifically, the memory cell state, as the crucial element, runs through the entire chain and stores state information over time, solving the gradient vanishing and explosion problems that the traditional recurrent neural network (RNN) encounters.…”
Section: Predictive Models (mentioning)
confidence: 99%
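
To make the gate description above concrete, here is a minimal NumPy sketch of a single LSTM step. The weight layout, names, and dimensions are illustrative assumptions, not the implementation from the cited paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts keyed by gate name
    ('i', 'f', 'o', 'g'); shapes here are illustrative assumptions."""
    # Input, forget, and output gates control what enters,
    # what persists in, and what leaves the memory cell state.
    i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])
    f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])
    o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])
    # Candidate values proposed for updating the cell state.
    g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])
    # The cell state runs through the chain largely unchanged,
    # which is what mitigates vanishing/exploding gradients.
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

# Hypothetical usage over a short sequence:
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W = {k: rng.standard_normal((n_hid, n_in)) * 0.1 for k in 'ifog'}
U = {k: rng.standard_normal((n_hid, n_hid)) * 0.1 for k in 'ifog'}
b = {k: np.zeros(n_hid) for k in 'ifog'}
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x_t in rng.standard_normal((5, n_in)):  # 5 time steps
    h, c = lstm_cell(x_t, h, c, W, U, b)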
“…However, according to Wang et al. [22], BP-NN has weak multi-output and self-learning abilities. LSTM-NN, as a newer neural network model, has been used in lithium battery lifetime prediction [23], complex chemical process yield prediction [24], and protein structure prediction [25], but applying LSTM-NN to aquatic product shelf-life prediction has scarcely been reported.…”
Section: Introduction (mentioning)
confidence: 99%