2020
DOI: 10.1002/spy2.134

On the use of generalized entropy formulas in detection of denial‐of‐service attacks

Abstract: In an earlier journal paper, we presented the results of a comparison of the performance of five entropy‐based denial‐of‐service detectors (Shannon, Rényi, Tsallis, Bhatia‐Singh, and Ubriaco). The dataset used in that evaluation was generated using simulation. The Rényi and Bhatia‐Singh detectors showed the best performance in terms of detection rate; in terms of detection delay, the best detector was Ubriaco. In this paper, we repeat the evaluation, this time using a dataset generated usin…
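For context, a minimal sketch (in Python) of how such entropy-based detectors operate: compute a generalized entropy over the empirical distribution of a traffic feature in each time window, and flag windows whose entropy deviates from a baseline. The entropy formulas are standard; the window contents, the α and q parameters, the baseline, and the threshold are illustrative assumptions, not values from the paper.

```python
import math
from collections import Counter

def shannon(p):
    """Shannon entropy: H = -sum p_i log2 p_i."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi(p, alpha=2.0):
    """Renyi entropy: H_a = (1/(1-a)) log2 sum p_i^a, for a != 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def tsallis(p, q=2.0):
    """Tsallis entropy: S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def window_entropy(addresses, measure=shannon):
    """Entropy of the empirical distribution of one traffic window."""
    counts = Counter(addresses)
    total = sum(counts.values())
    return measure([c / total for c in counts.values()])

# Hypothetical usage: flag a window whose destination-address entropy
# drops far below a baseline learned from attack-free traffic.
baseline, threshold = 6.2, 1.5  # illustrative values only
window = ["10.0.0.1"] * 950 + ["10.0.0.%d" % i for i in range(50)]
if window_entropy(window) < baseline - threshold:
    print("possible DoS: traffic concentrated on a single destination")
```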

Cited by 7 publications (4 citation statements)
References 22 publications
“…al. [8] study various DoS attack detectors based on different entropy types using source-side features and report that detectors based on the Tsallis [11] (in agreement with our study) and Bhatia-Singh [25] entropies perform best. Compared to their approach, we use differential and statistical analysis to signal an attack, which is hyperparameter-free and thus robust.…”
Section: Results (supporting; confidence: 82%)
“…There exists an abundance of research on the detection of DoS attacks [2], [3]. There are many different approaches, ranging from statistical ones [4], [5] to machine and deep learning [6], [7] to information-theoretic ones [8], [9], [10]. Each of these approaches has its own drawbacks.…”
Section: Introduction (mentioning; confidence: 99%)
“…Here H(X) is the entropy of X, H(X|Y) is the conditional entropy of X given Y, P(x_i|y_i) is the posterior probability of x_i given y_i, and IG(X|Y) is the information gain of X given Y. For high entropy, the information gain component is highly variable across the instances. 59 Computing it means subtracting the conditional entropy from the entropy of the class label.…”
Section: Proposed Model (mentioning; confidence: 99%)
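Written out as a worked equation, using the excerpt's symbols and the standard base-2 definitions (the excerpt does not give the cited paper's exact notation):

```latex
IG(X \mid Y) = H(X) - H(X \mid Y),
\quad\text{where}\quad
H(X) = -\sum_{i} P(x_i)\log_2 P(x_i),
\quad
H(X \mid Y) = -\sum_{j} P(y_j)\sum_{i} P(x_i \mid y_j)\log_2 P(x_i \mid y_j).
```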
“…For high entropy, the information gain component is highly variable across the instances. 59 Computing it means subtracting the conditional entropy from the entropy of the class label. Information gain is a frequency-dependent feature-selection method and is thus related to the number of occurrences of an event.…”
Section: Chi-Square Test (mentioning; confidence: 99%)
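A minimal sketch of the information-gain computation the two excerpts describe, for a discrete feature and a class label; the function names and toy data are mine, for illustration only:

```python
import math
from collections import Counter

def entropy(labels):
    """H(X) = -sum p_i log2 p_i over the empirical label distribution."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(labels, feature):
    """IG(X|Y) = H(X) - H(X|Y): the entropy of the class label minus
    the conditional entropy given each feature value."""
    total = len(labels)
    cond = 0.0
    for value in set(feature):
        subset = [l for l, f in zip(labels, feature) if f == value]
        cond += (len(subset) / total) * entropy(subset)
    return entropy(labels) - cond

# Hypothetical toy data: a feature that perfectly separates the classes
# yields IG equal to the full label entropy (here 1 bit).
labels  = ["attack", "attack", "benign", "benign"]
feature = ["high", "high", "low", "low"]
print(information_gain(labels, feature))  # -> 1.0
```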