2020
DOI: 10.1016/j.neucom.2019.10.084
A non-parametric softmax for improving neural attention in time-series forecasting

Cited by 24 publications (18 citation statements)
References 36 publications
“…Subsequently, scholars have further refined it within its structure. Totaro et al 47 study a learnable softmax operation to replace the regular softmax layer on top of DA‐RNN. Hu and Zheng 48 also improved on DA‐RNN with an LSTM model and a first‐stage attentional model as an encoder to make predictions about MTS data.…”
Section: Related Work
confidence: 99%
“…Neural networks are trained using Softmax as an activation function [31]. Softmax has the advantage of being computationally convenient and having a simple form of results for multiple classification problems.…”
Section: Softmax Activation Function
confidence: 99%
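The excerpt above notes that softmax is computationally convenient and yields a simple probability-shaped output for multi-class problems. A minimal sketch of the standard softmax activation (a generic illustration, not the cited paper's non-parametric variant):

```python
import numpy as np

def softmax(logits):
    """Softmax activation: maps raw scores to a probability distribution."""
    z = logits - np.max(logits)   # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([2.0, 0.5, -1.0])  # raw network outputs for 3 classes
probs = softmax(logits)
# probs is non-negative, sums to 1, and preserves the ordering of the logits
```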
“…This feature is particularly beneficial in time series prediction as it allows the model to weigh the importance of past and recent events differently, thereby significantly improving the accuracy of the predictions. The authors in (Totaro, Hussain and Scardapane, 2020) demonstrate the application of this technique in time series forecasting with the dual-stage attention-based recurrent neural network. This model is applied to the hourly data of Dogecoin price for its prediction over time.…”
Section: Related Work
confidence: 99%
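The excerpt above describes attention weighing past and recent events differently in time-series prediction. A minimal sketch of that weighting step, assuming simple dot-product scores over encoder hidden states (illustrative names and scoring choice, not the cited dual-stage model's exact architecture):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))     # shift by max for numerical stability
    return e / e.sum()

def attend(hidden_states, query):
    """Weight encoder hidden states by softmax-normalized relevance scores.

    hidden_states: (T, d) array, one row per past time step.
    query: (d,) decoder state used to score each step.
    Returns the attention-weighted context vector of shape (d,).
    """
    scores = hidden_states @ query    # dot-product relevance per time step
    weights = softmax(scores)         # normalized importance per step
    return weights @ hidden_states    # context vector

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 3))   # 5 past time steps, 3-dim hidden states
q = rng.normal(size=3)
context = attend(H, q)
```

The softmax here is what the forecaster trains against; replacing it with a learnable variant, as the cited paper proposes, changes only the normalization step.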