2020
DOI: 10.1186/s13673-020-00242-w

Modelling email traffic workloads with RNN and LSTM models

Abstract: Analysis of time series data has been a challenging research subject for decades. Email traffic has recently been modelled as a time series function using a Recurrent Neural Network (RNN) and RNNs were shown to provide higher prediction accuracy than previous probabilistic models from the literature. Given the exponential rise of email workloads which need to be handled by email servers, in this paper we first present and discuss the literature on modelling email traffic. We then explain the advantages and lim…

Cited by 16 publications (6 citation statements)
References 25 publications
“…However, when the input time series is very long, the effectiveness of the RNN degrades significantly, and its ability to process long-term sequences is weak. At the same time, as the input sequence grows, the effective depth of the model also increases, and RNNs suffer from vanishing and exploding gradients during training [5]. To address these issues of simple RNNs, the Long Short-Term Memory (LSTM) model was proposed.…”
Section: LSTM Model
confidence: 99%
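The LSTM mechanism the statement above refers to can be illustrated with a minimal single-step cell in NumPy. This is a generic sketch, not the cited paper's implementation: the gate layout (input, forget, output, candidate stacked in one pre-activation vector) and the weight names `W`, `U`, `b` are illustrative assumptions.

```python
import numpy as np

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One LSTM step. The additive cell-state update c = f*c_prev + i*g is
    what lets gradients flow across long sequences, mitigating the
    vanishing-gradient problem of a simple RNN."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # stacked pre-activations for the 4 gates
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    i = sigmoid(z[:n])                    # input gate: how much new info to admit
    f = sigmoid(z[n:2 * n])               # forget gate: how much old state to keep
    o = sigmoid(z[2 * n:3 * n])           # output gate: how much state to expose
    g = np.tanh(z[3 * n:])                # candidate cell update
    c = f * c_prev + i * g                # gated, additive cell-state update
    h = o * np.tanh(c)                    # new hidden state
    return h, c

# Hypothetical dimensions for demonstration only.
rng = np.random.default_rng(0)
n, d = 3, 2                               # hidden size, input size
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = lstm_cell(rng.normal(size=d), np.zeros(n), np.zeros(n), W, U, b)
```

In practice one would use a library cell (e.g. a framework's LSTM layer) rather than hand-rolling this; the sketch only makes the gating structure in the quoted statement concrete.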
“…ANNs are built primarily on the features extracted from input data. ANNs deal with the nonlinear relationships of time series data by learning from experience and building complex neural networks to generate optimal solutions for prediction problems [32, 33]. An ANN consists of an input layer, zero or more hidden layers, and an output layer.…”
Section: Related Work
confidence: 99%
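The layer structure described above (input layer, optional hidden layers, output layer) can be sketched as a plain feedforward pass. This is a generic illustration, not code from the cited works; the ReLU hidden activation and linear output are assumptions.

```python
import numpy as np

def mlp_forward(x, layers):
    """Feedforward ANN: the input x passes through each hidden layer
    (affine transform + ReLU nonlinearity), then a linear output layer.
    `layers` is a list of (W, b) weight/bias pairs."""
    for W, b in layers[:-1]:
        x = np.maximum(0.0, W @ x + b)    # hidden layer with ReLU
    W, b = layers[-1]
    return W @ x + b                       # linear output layer

# Hypothetical shapes: 3 inputs -> 4 hidden units -> 2 outputs.
rng = np.random.default_rng(1)
layers = [
    (rng.normal(size=(4, 3)), np.zeros(4)),  # hidden layer
    (rng.normal(size=(2, 4)), np.zeros(2)),  # output layer
]
y = mlp_forward(np.ones(3), layers)
```

The nonlinearity between layers is what lets such a network model the nonlinear time-series relationships the statement mentions; without it the whole stack collapses to a single linear map.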
“…Since the RNN model stores historical information across its network layers, the fewer the layers, the more incomplete the recorded history. Conversely, the more layers there are, the more complex training becomes and the more prone the gradients are to shrinking rapidly or vanishing altogether; both effects degrade the prediction performance of the model [25].…”
Section: RNN and LSTM. The earliest known RNN is the Hopfield network
confidence: 99%
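The depth/gradient trade-off in the statement above reduces to simple arithmetic in the scalar linear case. Consider a hypothetical one-unit linear RNN h_t = w * h_{t-1}: by the chain rule, the gradient of h_T with respect to h_0 multiplies by w once per step, so it vanishes when |w| < 1 and explodes when |w| > 1 as the sequence (and effective depth) grows.

```python
def rnn_gradient_norm(w, steps):
    """Magnitude of d(h_T)/d(h_0) for a scalar linear RNN h_t = w * h_{t-1}.
    Each unrolled step contributes one factor of w, so the gradient
    scales as |w|**steps: vanishing for |w| < 1, exploding for |w| > 1."""
    return abs(w) ** steps

vanishing = rnn_gradient_norm(0.9, 100)   # shrinks toward zero with depth
exploding = rnn_gradient_norm(1.1, 100)   # grows without bound with depth
```

Real RNNs add nonlinearities and matrix-valued weights, but the same repeated-multiplication effect (via the Jacobian's spectral radius) drives the behavior the citing paper describes.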