2020
DOI: 10.1007/s12652-020-02167-9
Attack classification using feature selection techniques: a comparative study

Cited by 100 publications (38 citation statements)
References 47 publications
“…[44] in the case wherein these datasets are unbalanced. To address imbalance in training data, researchers focus on feature selection specifically, as in [43,45,46]. Many research groups have followed the success of machine learning, integrating machine learning into NIDS to leverage attack detection quality.…”
Section: Related Work (mentioning)
confidence: 99%
“…The authors in [45] have made comparisons between several machine learning and deep learning algorithms, including decision tree, random forest, K-nearest neighbors, logistic regression, naïve Bayes, support vector machine, and ANN. However, the authors focus more on the feature selection problem, so the above classification algorithms have not been evaluated thoroughly.…”
Section: Related Work (mentioning)
confidence: 99%
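The comparison described in this statement covers standard classifiers. Below is a minimal, hypothetical sketch of such a side-by-side evaluation in scikit-learn; the synthetic stand-in dataset, hyperparameters, and accuracy metric are assumptions for illustration, not settings from the cited paper.

```python
# Hypothetical sketch of the classifier comparison described above,
# run on a synthetic stand-in dataset (not the paper's data).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=40, n_informative=10,
                           random_state=0)

models = {
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "K-nearest neighbors": KNeighborsClassifier(n_neighbors=5),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
    "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                               random_state=0),
}

# 5-fold cross-validated accuracy for each candidate classifier.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name:22s} {scores.mean():.3f} +/- {scores.std():.3f}")
```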
“…In this study, the L1-norm SVM algorithm was used for the statistical and threshold-based feature selection algorithm since it provided high classification performance in many existing methods [25][26][27]. The Chi-square is the most used correlation-based feature selection algorithm [28][29][30][31][32][33]. The features that were obtained from both algorithms were stacked in a vector.…”
Section: Multi-level Feature Selection (mentioning)
confidence: 99%
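As a rough illustration of the two-branch selection described in this statement, the following hedged sketch combines an L1-norm linear SVM branch with a Chi-square branch and stacks the two selected feature subsets into one vector. The dataset, C, and k values are illustrative assumptions, not the cited study's settings.

```python
# A minimal sketch of the multi-level feature selection described above:
# an L1-norm linear SVM keeps one feature subset, a Chi-square filter keeps
# another, and the two subsets are stacked (concatenated) into one vector.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import LinearSVC
from sklearn.feature_selection import SelectFromModel, SelectKBest, chi2

X, y = make_classification(n_samples=1000, n_features=50, n_informative=12,
                           random_state=0)
X = MinMaxScaler().fit_transform(X)  # chi2 requires non-negative inputs

# Branch 1: sparse (L1-penalised) linear SVM keeps features with non-zero weights.
l1_svm = LinearSVC(C=0.05, penalty="l1", dual=False, max_iter=5000).fit(X, y)
X_svm = SelectFromModel(l1_svm, prefit=True).transform(X)

# Branch 2: Chi-square filter keeps the k features most associated with the label.
X_chi = SelectKBest(chi2, k=15).fit(X, y).transform(X)

# Stack the two selected feature subsets into a single feature vector.
X_stacked = np.hstack([X_svm, X_chi])
print("L1-SVM kept:", X_svm.shape[1], "| Chi2 kept:", X_chi.shape[1],
      "| stacked width:", X_stacked.shape[1])
```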
“…Feature selection is a method of selecting relevant features of a dataset by obtaining a subset from the original feature set based on specific criteria. Data dimensionality reduction is often applied to high-dimensional complex data [8]. Unlike feature extraction, feature selection preserves the physical meaning of the original features by retaining some of the data, and thus makes the model more readable and interpretable [9, 10].…”
Section: Background and Related Work (mentioning)
confidence: 99%
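To make the distinction drawn in this statement concrete, here is a small illustrative sketch (on a standard scikit-learn toy dataset, not data from the cited work): feature selection keeps named original columns, so their physical meaning survives, while feature extraction via PCA produces unnamed mixed components.

```python
# Sketch contrasting feature selection with feature extraction on a toy dataset.
# Selection keeps a subset of the original, named features (interpretability
# preserved); extraction (PCA) builds new components without that meaning.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import PCA

data = load_breast_cancer(as_frame=True)
X, y = data.data, data.target

# Feature selection: keep the 5 original columns most informative about the label.
selector = SelectKBest(mutual_info_classif, k=5).fit(X, y)
kept = X.columns[selector.get_support()]
print("Selected original features:", list(kept))

# Feature extraction: each PCA component is a linear mixture of all columns,
# so the physical meaning of individual features is lost.
components = PCA(n_components=5).fit_transform(X)
print("Extracted component matrix shape:", components.shape)
```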