2022
DOI: 10.1007/978-3-031-20429-6_2
A Review of Long Short-Term Memory Approach for Time Series Analysis and Forecasting

Cited by 6 publications (3 citation statements) · References 35 publications
“…To address these limitations, the LSTM algorithm can be used, as it can capture both non-linearity and sequential information in the data. The LSTM is a powerful time series model that is derived from the recurrent neural network (RNN) structure [11,12]. Unlike RNNs, LSTM can memorize long-term sequential information by using extra memory lines and control gates, making it a suitable algorithm for time series problems such as stock prediction [13].…”
Section: Related Work
confidence: 99%
“…The primary contrast between recurrent neural networks (RNNs) and LSTMs is that LSTMs are an advancement of RNNs that enhances their memory. LSTMs exhibit superior performance in retaining long-term dependencies in the data compared to RNNs, accomplished by utilizing special memory cells and gate functions [6].…”
Section: Introduction
confidence: 99%
“…The primary contrast between recurrent neural networks (RNNs) and LSTMs is that LSTMs are an advancement of RNNs that enhances their memory. LSTMs exhibit superior performance in retaining long-term dependencies in the data compared to RNNs, accomplished by utilizing special memory cells and gate functions Ab Kader (2023).…”
confidence: 99%
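The citation statements above all describe the same mechanism: an LSTM augments an RNN with a cell state ("memory line") regulated by forget, input, and output gates, which lets it retain long-term sequential information. A minimal NumPy sketch of one LSTM step follows; the weight layout (gates stacked as forget/input/candidate/output), sizes, and toy input series are illustrative assumptions, not taken from the reviewed paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: gates control what the cell state keeps, adds, and emits.

    Assumed layout: W has shape (4*H, D+H), b has shape (4*H,), with rows
    ordered [forget, input, candidate, output] -- a common convention.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[:H])        # forget gate: how much of c_prev to keep
    i = sigmoid(z[H:2*H])     # input gate: how much new information to write
    g = np.tanh(z[2*H:3*H])   # candidate values for the cell state
    o = sigmoid(z[3*H:])      # output gate: how much of the cell to expose
    c = f * c_prev + i * g    # the "memory line" carrying long-term info
    h = o * np.tanh(c)        # hidden state passed to the next time step
    return h, c

# Run a toy univariate series through the cell, step by step.
rng = np.random.default_rng(0)
D, H = 1, 4                               # input and hidden sizes (illustrative)
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x_t in [0.1, 0.5, -0.2, 0.3]:
    h, c = lstm_step(np.array([x_t]), h, c, W, b)
print(h.shape)
```

Because the cell state `c` is updated additively (`f * c_prev + i * g`) rather than being squashed through a nonlinearity at every step, gradients can flow across many time steps — this is the structural difference from a plain RNN that the cited statements refer to.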