2016
DOI: 10.13052/jcs2445-9992.2016.001
A Feature Selection Approach Based on Simulated Annealing for Detecting Various Denial of Service Attacks

Cited by 6 publications (5 citation statements)
References 0 publications
“…In [16], four undersampling techniques based on neighbourhood searching (NB-based) were proposed, utilizing the k-NN rule to select and remove majority-class examples from the potential region of overlap. RF (Random Forest) and SVM (Support Vector Machine) were used in [16] for learning, and the results were compared with several state-of-the-art pre-processing techniques that rebalance the dataset before applying the learning algorithm, such as SMOTE (Synthetic Minority Over-Sampling Technique) [40], kmUnder (k-means undersampling) [29], OBU [41], BLSMOTE [42] and ENN [43]. Columns "NB-SVM" and "NB-RF" in Tables 2 and 3 present the best G-mean and F-score values, respectively, selected for each classifier (SVM and RF) among the four NB-based methods, while the subsequent columns present the best value obtained by the classifiers (SVM and RF) after applying the state-of-the-art pre-processing techniques. Bold values in Tables 2 and 3 represent the best value in each row, while italic represents the second- and third-best values in the same row.…”
Section: Results
confidence: 99%
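The neighbourhood-based undersampling idea quoted above can be illustrated with a minimal sketch. This is an assumed variant, not the exact method of [16]: each majority-class example's k nearest neighbours are inspected, and the example is discarded when its neighbourhood is dominated by the minority class, i.e. when it lies in the region of class overlap. The function name and parameters are hypothetical.

```python
from collections import Counter

def knn_undersample(X, y, majority_label, k=3):
    """Remove majority-class examples whose k nearest neighbours are
    mostly minority class (assumed overlap-region criterion)."""
    def sq_dist(a, b):
        # squared Euclidean distance between two feature vectors
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    keep = []
    for i, (xi, yi) in enumerate(zip(X, y)):
        if yi != majority_label:
            keep.append(i)          # minority examples are always kept
            continue
        # indices of the k nearest neighbours of xi among the other points
        neighbours = sorted(
            (j for j in range(len(X)) if j != i),
            key=lambda j: sq_dist(xi, X[j]),
        )[:k]
        labels = Counter(y[j] for j in neighbours)
        # keep the majority example only if its neighbourhood is not
        # dominated by the minority class
        if labels[majority_label] >= (k + 1) // 2:
            keep.append(i)
    return [X[i] for i in keep], [y[i] for i in keep]
```

On a toy dataset with one majority point placed inside the minority cluster, that point is the only one removed, which is the intended effect of cleaning the overlap region before training SVM or RF.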
“…The Boltzmann probability, P = e^(−Φ/T), is applied as the acceptance condition for the neighbour solution. Φ denotes the difference between the fitness of the optimal and neighbour solutions, and T is the temperature, which is gradually reduced according to a cooling schedule throughout the search procedure [20,23,25]. In this paper, as adopted in [25], the initial temperature equals 2 * |N|, where N is the number of features in each data set, and T = 0.93 * T is applied as the cooling schedule.…”
Section: Simulated Annealing
confidence: 99%
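The acceptance rule and cooling schedule quoted above can be sketched as follows. The fitness function here is a toy stand-in (the cited work evaluates feature subsets against a classifier); the initial temperature 2 * |N| and the geometric cooling T = 0.93 * T follow the excerpt, everything else is an illustrative assumption.

```python
import math
import random

def sa_accept_demo(n_features=10, iterations=50, rng=random):
    """Sketch of the SA acceptance rule: better neighbours are always
    accepted; worse ones with Boltzmann probability P = e^(-phi/T)."""
    T = 2 * n_features                  # initial temperature = 2 * |N|
    current = rng.random()              # toy fitness of current solution
    for _ in range(iterations):
        neighbour = rng.random()        # toy fitness of neighbour solution
        phi = abs(current - neighbour)  # fitness difference between solutions
        if neighbour > current:
            current = neighbour         # improving move: always accept
        elif rng.random() < math.exp(-phi / T):
            current = neighbour         # worsening move: accept with P = e^(-phi/T)
        T *= 0.93                       # cooling schedule: T = 0.93 * T
    return current, T
```

Because T starts high, early iterations accept almost any neighbour (exploration); as T decays geometrically, the acceptance probability for worse solutions shrinks toward zero (exploitation).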
“…Considering data reduction with respect to the set of features used for model training, Jeong et al. (2016) propose a feature selection approach based on simulated annealing and apply it to a case study on detecting denial-of-service attacks [90]. This approach is similar to [91], which uses the same data set but a different local search algorithm.…”
Section: Framework For Data Reduction
confidence: 99%