2019
DOI: 10.1109/tcsii.2019.2924663
Economic LSTM Approach for Recurrent Neural Networks

Cited by 59 publications (15 citation statements)
References 13 publications
“…The GAPG method is composed of two networks. The first is a recurrent neural network (RNN) that uses long short-term memory (LSTM) [39] as its concrete implementation. This network acts as a generator G that receives a set of verification properties as its initial input; it then tries to generate verification properties that are sufficiently similar to the input data.…”
Section: CGAN The Original GAN Can Perform Only Generation Based On R...
confidence: 99%
“…Similar to the GRU, using a more compact recursion formula to reduce the number of gates can simplify the network while achieving performance comparable to the LSTM. For example, Khalil et al. [23] combine the forget gate G fg , input gate G in , and output gate G ot into a single gate, and let the combined gate and the tanh gate include the internal state α t−1 as peephole connections. The proposed network thus requires fewer hardware units to perform its functionality.…”
Section: Modified Peephole LSTM
confidence: 99%
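The merged-gate design summarized in the statement above can be sketched as a single recurrence. This is a minimal NumPy sketch, not the authors' implementation: the function name, weight shapes, and the exact way the single gate is reused for the forget, input, and output roles are assumptions, since the citation only summarizes Khalil et al.'s cell.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def economic_lstm_step(x_t, h_prev, c_prev, W_g, W_a, b_g, b_a):
    """One step of a merged-gate LSTM cell (hedged sketch).

    A single gate g stands in for the separate forget/input/output
    gates, and both g and the tanh candidate receive the previous
    internal state c_prev as a peephole input. The convex-mix update
    below is an assumed coupling, not the paper's exact formula.
    """
    z = np.concatenate([x_t, h_prev, c_prev])  # peephole: c_prev enters both gates
    g = sigmoid(W_g @ z + b_g)                 # one gate shared by all three roles
    a = np.tanh(W_a @ z + b_a)                 # candidate internal state
    c = g * c_prev + (1.0 - g) * a             # g retains old state, (1-g) admits new
    h = g * np.tanh(c)                         # same gate reused as the output gate
    return h, c

# Usage with hypothetical sizes: 4 inputs, 3 hidden units, 5 time steps.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W_g = rng.standard_normal((n_hid, n_in + 2 * n_hid))
W_a = rng.standard_normal((n_hid, n_in + 2 * n_hid))
b_g = np.zeros(n_hid)
b_a = np.zeros(n_hid)
h = c = np.zeros(n_hid)
for _ in range(5):
    h, c = economic_lstm_step(rng.standard_normal(n_in), h, c, W_g, W_a, b_g, b_a)
```

Compared with a standard LSTM step, this cell needs one sigmoid gate instead of three, which is the source of the hardware savings the citing paper refers to.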
“…The RNN has been widely used to handle longitudinal variables; the LSTM is one type of RNN (24,25). It can effectively process large amounts of sequential data.…”
Section: Recurrent Neural Network - Long Short-Term Memory
confidence: 99%