2021
DOI: 10.1155/2021/6689204

Hybrid LSTM Self‐Attention Mechanism Model for Forecasting the Reform of Scientific Research in Morocco

Abstract: Education is the cultivation of people to promote and guarantee the development of society. Education reforms can play a vital role in the development of a country. However, it is crucial to continually monitor an educational model's performance by forecasting the progress of its outcomes. Machine learning-based models are currently an active topic in forecasting research. Forecasting models can help analyse the impact of future outcomes by showing yearly trends. For this study, we developed a hyb…

Citations: cited by 17 publications (4 citation statements)
References: 42 publications (31 reference statements)
“…LSTM networks have been used to forecast stock prices, workload in cloud data centres, and power demand. For example, Fahim et al. (2021) developed a forecasting algorithm called SAM-LSTM, a fusion of a self-attention mechanism (SAM) and an LSTM, to forecast the reform of scientific research in Morocco. Qiu et al. (2020) added investor sentiment tendency to the model analysis and introduced empirical mode decomposition (EMD) combined with LSTM to obtain more accurate stock forecasts.…”
Section: Long Short-Term Memory (LSTM)
confidence: 99%
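To make the SAM-LSTM idea concrete, the sketch below shows one way to fuse an LSTM with a self-attention layer for time-series forecasting in PyTorch. The layer sizes, multi-head attention choice, and one-step-ahead output head are illustrative assumptions, not the exact architecture reported by Fahim et al. (2021).

```python
# Minimal sketch: LSTM hidden states re-weighted by self-attention,
# then a linear head produces a one-step-ahead forecast.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class SAMLSTMSketch(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, heads: int = 4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        # Self-attention over the LSTM outputs: each time step attends to
        # every other step and re-weights its representation.
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # one-step-ahead forecast

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features), e.g. yearly research indicators
        h, _ = self.lstm(x)            # (batch, time, hidden)
        a, _ = self.attn(h, h, h)      # self-attention fusion over time
        return self.head(a[:, -1, :])  # forecast from the last time step


if __name__ == "__main__":
    model = SAMLSTMSketch(n_features=5)
    dummy = torch.randn(8, 10, 5)      # 8 series, 10 years, 5 indicators
    print(model(dummy).shape)          # torch.Size([8, 1])
```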
“…Briefly, the Transformer is an architecture for transforming one sequence into another with the help of an attention-based encoder and decoder. The attention mechanism looks at the input sequence and decides, at each step, how important the rest of the sequence is, thus helping to capture global information from the input sequence [7]. The Transformer has replaced recurrent neural networks in many sequential tasks (natural language processing, speech processing, and computer vision) and is gradually being extended to handle non-sequential problems.…”
Section: Introduction
confidence: 99%
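The attention behaviour the quoted passage describes corresponds to scaled dot-product attention. The sketch below, with purely illustrative shapes and q = k = v (self-attention), shows how each step of a sequence scores its relevance to every other step and re-mixes the sequence accordingly.

```python
# Minimal NumPy sketch of scaled dot-product self-attention.
# Shapes and the random toy input are illustrative assumptions.
import numpy as np


def scaled_dot_product_attention(q, k, v):
    # q, k, v: (seq_len, d_model); each row scores how much that step
    # should look at every other step of the sequence.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                            # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                                       # re-weighted sequence


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(6, 8))                    # 6 steps, 8 features
    out = scaled_dot_product_attention(x, x, x)    # self-attention: q = k = v
    print(out.shape)                               # (6, 8)
```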
“…For the detection of Internet rumors, there has been research both domestically and abroad, which can be divided into supervised and unsupervised methods and deep learning-based detection methods [11,12]. For example, Arp et al. in Ref.…”
Section: Introduction
confidence: 99%