2020 Fifth International Conference on Fog and Mobile Edge Computing (FMEC) 2020
DOI: 10.1109/fmec49853.2020.9144972
Dynamic Economic-Denial-of-Sustainability (EDoS) Detection in SDN-based Cloud

Cited by 14 publications (8 citation statements) · References 16 publications
“…Researchers in [5] proposed an approach to mitigate EDoS attacks in the SDN-based cloud computing environment. An unsupervised deep learning technique called long short-term memory (LSTM) was used as a multivariate time series anomaly detection model.…”
Section: Solutions at Application Level
confidence: 99%
“…In the post-COVID-19 world, it is clear that more people and businesses are adopting cloud services, software, and infrastructure, as they can be accessed anytime, and from anywhere. To handle security risks, several research works and developments, such as in [5][6][7][8], have been proposed. Nonetheless, there are still more opportunities for new techniques to make the cloud more secure.…”
Section: Introduction
confidence: 99%
“…LSTM is another recurrent model that can address the RNN memory problem. In previous studies [12,20,21], LSTM showed major improvements over what RNNs could accomplish. LSTM is designed to avoid long dependency problems and can remember long historical information and gain high accuracy in EDoS detection with a sequence flow-based method.…”
Section: Related Work
confidence: 99%
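The gating mechanism the quote alludes to — the reason LSTM can retain long-range flow history where a plain RNN forgets it — can be sketched as a single forward step in NumPy. This is an illustrative sketch, not the cited papers' implementation; all variable names and dimensions are assumptions.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM forward step. The forget gate f decides how much of the
    long-term cell state c_prev to keep, which is what mitigates the
    vanishing-memory problem of plain RNNs on long packet sequences."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # stacked pre-activations, shape (4*n,)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    i = sigmoid(z[:n])                    # input gate: admit new information
    f = sigmoid(z[n:2 * n])               # forget gate: retain old cell state
    o = sigmoid(z[2 * n:3 * n])           # output gate: expose cell state
    g = np.tanh(z[3 * n:])                # candidate cell update
    c = f * c_prev + i * g                # cell state: long-term memory
    h = o * np.tanh(c)                    # hidden state: per-step output
    return h, c
```

In an anomaly-detection setting such as the one described, a step like this would be applied across each flow's packet-feature sequence, with the final hidden state feeding a classifier or reconstruction loss.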
“…In this section, the workflow of the scheme is discussed, including: data capture, data preprocessing and model architecture as shown in Figure 6. Referring to a previous study [21], the sequence length is 250 packets per flow. To overcome the loss of information and memory vanishing of LSTM, a longer sequence length of the window slot, that is 500 packets per flow sequence, is proposed.…”
Section: Preprocessing and Model Workflow
confidence: 99%
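The fixed-length windowing described in the quote (250 packets per flow in the earlier study, 500 in the proposed scheme) can be sketched as a simple preprocessing helper. This is a minimal sketch under the assumption that each packet has already been reduced to a single numeric feature; the function name and padding strategy are illustrative, not from the cited work.

```python
def flow_to_sequences(packet_features, seq_len=500):
    """Split one flow's packet-feature list into fixed-length sequences
    of seq_len packets each, zero-padding the final partial window so
    that no packets are dropped before they reach the LSTM."""
    pad = (-len(packet_features)) % seq_len          # packets needed to fill last window
    padded = list(packet_features) + [0.0] * pad
    return [padded[i:i + seq_len]
            for i in range(0, len(padded), seq_len)]
```

For example, a 1200-packet flow with `seq_len=500` yields three sequences, the last one padded with 300 zeros; the same flow with `seq_len=250` would yield five fully populated sequences.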