2018 International Conference on Intelligent Rail Transportation (ICIRT)
DOI: 10.1109/icirt.2018.8641683

Short-term forecasting of rail transit passenger flow based on long short-term memory neural network

Cited by 16 publications (14 citation statements)
References 19 publications
“…LSTM consists of five main components: memory block, memory cells, input gate, output gate, and forget gate. The whole computation can be defined by a series of equations as follows [39]:

$$i_t = \sigma(W^i H + b^i)$$
$$f_t = \sigma(W^f H + b^f)$$
$$o_t = \sigma(W^o H + b^o)$$
$$c_t = \tanh(W^c H + b^c)$$
$$m_t = f_t \odot m_{t-1} + i_t \odot c_t$$
$$h_t = \tanh(o_t \odot m_t)$$

where $\sigma$ is the sigmoid function, $W^i$, $W^f$, …”
Section: Methods (mentioning; confidence: 99%)
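To make the excerpted equations concrete, here is a minimal NumPy sketch of a single LSTM step that follows them literally (including $h_t = \tanh(o_t \odot m_t)$, as written). The excerpt truncates before defining $H$, so it is assumed here to be the concatenation $[h_{t-1}, x_t]$; all names (`lstm_step`, `Wi`, `bi`, …) are illustrative, not from the cited paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, m_prev, params):
    """One LSTM step following the quoted equations.

    H is assumed (the excerpt truncates before defining it) to be the
    concatenation [h_{t-1}, x_t].
    """
    H = np.concatenate([h_prev, x_t])
    i_t = sigmoid(params["Wi"] @ H + params["bi"])  # input gate
    f_t = sigmoid(params["Wf"] @ H + params["bf"])  # forget gate
    o_t = sigmoid(params["Wo"] @ H + params["bo"])  # output gate
    c_t = np.tanh(params["Wc"] @ H + params["bc"])  # candidate cell input
    m_t = f_t * m_prev + i_t * c_t                  # memory cell update
    h_t = np.tanh(o_t * m_t)                        # output, exactly as excerpted
    return h_t, m_t

# Toy usage: run a 10-step random sequence through the cell.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
params = {f"W{g}": rng.normal(scale=0.1, size=(n_hid, n_in + n_hid)) for g in "ifoc"}
params.update({f"b{g}": np.zeros(n_hid) for g in "ifoc"})
h, m = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(10, n_in)):
    h, m = lstm_step(x, h, m, params)
```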
“…This type of RNN contains an input layer, a recurrent hidden layer, and an output layer, with a memory block structure as shown in Figure 3 [70].…”
Section: Fundamentals and Concepts of Machine Learning and Deep Learn… (mentioning; confidence: 99%)
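For comparison with the LSTM step above, a minimal sketch of that three-layer recurrent structure (input layer, recurrent hidden layer, output layer, with a plain tanh unit standing in for the memory block) might look like the following; all names are illustrative, not from the cited paper.

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, Why, bh, by):
    """Minimal Elman-style RNN: input layer -> recurrent hidden layer -> output layer."""
    h = np.zeros(Whh.shape[0])
    ys = []
    for x in xs:                              # xs: (T, n_in) input sequence
        h = np.tanh(Wxh @ x + Whh @ h + bh)   # recurrent hidden layer
        ys.append(Why @ h + by)               # output layer
    return np.array(ys), h
```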
“…The LSTM memory block can be described according to the following equations [70]:

$$i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)$$
$$f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)$$
$$c_t = f_t \odot c_{t-1} + i_t \odot g(W_c x_t + U_c h_{t-1} + b_c)$$
$$o_t = \sigma(W_o x_t + U_o h_{t-1} + V_o c_t + b_o)$$
$$h_t = o_t \odot h(c_t)$$

where $x_t$ is the model input at time $t$; $W_i$, $W_f$, $W_c$, $W_o$, $U_i$, $U_f$, $U_c$, $U_o$, $V_o$ are weight matrices; $b_i$, $b_f$, $b_c$, $b_o$ are bias vectors; $i_t$, $f_t$, $o_t$ are respectively the activations of the three gates at time $t$; $c_t$ is the state of the memory cell at time $t$; $h_t$ is the output of the memory block at time $t$; $\odot$ denotes the element-wise (Hadamard) product of two vectors; $\sigma$ is the gate activation function; $g(x)$ is the cell input activation function; and $h(x)$ is the cell output activation function.…”
Section: Fundamentals and Concepts of Machine Learning and Deep Learn… (mentioning; confidence: 99%)
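Under the assumption that the equations reconstructed above are the intended ones (a Graves-style memory block, with $V_o$ read as a peephole from the current cell state to the output gate), one step could be sketched as follows; the parameter dictionary and function names are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_memory_block(x_t, h_prev, c_prev, p, g=np.tanh, h_act=np.tanh):
    """One step of the memory block as reconstructed above.

    p maps names to the weight matrices W*, U*, Vo and biases b*;
    g and h_act are the cell input/output activations (tanh here).
    Reading V_o as a peephole from c_t to the output gate is an
    assumption consistent with the quoted variable list.
    """
    i_t = sigmoid(p["Wi"] @ x_t + p["Ui"] @ h_prev + p["bi"])                  # input gate
    f_t = sigmoid(p["Wf"] @ x_t + p["Uf"] @ h_prev + p["bf"])                  # forget gate
    c_t = f_t * c_prev + i_t * g(p["Wc"] @ x_t + p["Uc"] @ h_prev + p["bc"])   # cell state
    o_t = sigmoid(p["Wo"] @ x_t + p["Uo"] @ h_prev + p["Vo"] @ c_t + p["bo"])  # output gate
    h_t = o_t * h_act(c_t)                                                     # block output
    return h_t, c_t
```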
“…In recent years, nonlinear models represented by neural networks have developed rapidly and are now widely used in pattern recognition, sample classification, prediction, and other fields. For example, S. Sha [8] proposed a subway passenger flow prediction method based on RNN; however, as the relevant theory matured, it became clear that RNNs cannot handle long-distance dependencies well. In 2018, Yuan Liu and Yong Qin [9] applied the LSTM model to short-term passenger flow prediction to address the vanishing-gradient problem that RNNs suffer during training.…”
Section: Introduction (mentioning; confidence: 99%)
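As an illustration of the kind of short-term passenger flow setup these citing papers describe (not the authors' actual pipeline), a minimal Keras sketch follows: synthetic counts stand in for real automatic-fare-collection data, and the window length and layer width are arbitrary choices.

```python
import numpy as np
import tensorflow as tf

# Hypothetical framing: 15-minute passenger counts, windowed so that the
# previous `lookback` intervals predict the next one. Random data stands
# in for real automatic-fare-collection records.
lookback = 8
counts = np.random.rand(1000).astype("float32")
X = np.stack([counts[i:i + lookback] for i in range(len(counts) - lookback)])
y = counts[lookback:]
X = X[..., None]  # (samples, lookback, 1) for the LSTM layer

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(lookback, 1)),
    tf.keras.layers.Dense(1),  # next-interval passenger count
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```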