2021
DOI: 10.1080/09720529.2021.2009189
Multi-class SVM based network intrusion detection with attribute selection using infinite feature selection technique

Cited by 8 publications (5 citation statements)
References 17 publications
“…As listed in Table 3, we compare the proposed model with classic machine learning classifiers and deep learning models on the testing set in the multi-class scenario. The experimental results show that the key indicators of RF [18], SVM [19] and KNN [20] are significantly lower than those of the deep learning models. However, their detection time is faster than that of the deep learning models, since they have only a few parameters for classification.…”
Section: Performance Comparison Experiments
confidence: 98%
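The shallow-classifier comparison quoted above can be sketched with scikit-learn. This is a minimal illustration, not the paper's setup: the dataset is synthetic, and the hyperparameters (100 trees, RBF kernel, k=5) are assumed defaults rather than values from the cited works.

```python
# Sketch: comparing the shallow multi-class classifiers (RF, SVM, KNN)
# mentioned in the citation statement. Dataset and hyperparameters are
# illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic 4-class stand-in for a multi-class intrusion dataset.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf"),  # multi-class handled internally via one-vs-one
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: accuracy={acc:.3f}")
```

As the quote notes, such shallow models have few parameters, so their prediction time is typically much lower than that of deep models, even when their accuracy lags.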
“…During comparison experiments, we compare the proposed MSCBL-ADN model with the RF [18], SVM [19] and KNN [20] shallow classifiers on the testing set. We also compare it with CNN-based models (CNN-BMECapSA-RF [25], LRDADF [27]), an LSTM-based model (VLSTM [30]), and CNN-LSTM-based models (AsyncFL-bLAM [16], NIDS-CNN-LSTM [33]) during the training and testing phases.…”
Section: Performance Comparison Experiments
confidence: 99%
“…This algorithm was chosen because of its high scalability to large data sets [18] with a large number of features, as in this case, and because it has proven its excellent performance for IDS [35]. The computational complexity of IFS is O(n³ × T) [36], where T is the number of samples and n is the number of initial features.…”
Section: Infinite Feature Selection Algorithm
confidence: 99%
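The O(n³ × T) cost quoted above comes from the closed-form step of infinite feature selection: summing the weights of all paths of all lengths through a feature-affinity graph via a geometric matrix series, which requires an n × n matrix inversion. The sketch below follows the standard Inf-FS formulation; the affinity construction (mixing per-feature spread with pairwise decorrelation) and the mixing weight `alpha` are assumptions for illustration, not the exact choices of the paper.

```python
# Sketch of Inf-FS-style feature scoring: rank features by the total energy
# of all paths through them in a weighted feature-affinity graph, computed
# in closed form as sum_{l>=1} (r*A)^l = (I - r*A)^(-1) - I.
import numpy as np

def inf_fs_scores(X, alpha=0.5):
    """Score features of X (samples x features); higher = more important."""
    n = X.shape[1]
    sigma = X.std(axis=0)                         # per-feature spread (relevance)
    corr = np.abs(np.corrcoef(X, rowvar=False))   # pairwise |correlation|
    # Affinity mixes relevance with non-redundancy (1 - |corr|); all entries >= 0.
    A = alpha * np.maximum.outer(sigma, sigma) + (1 - alpha) * (1 - corr)
    # Scale A so the geometric series converges (spectral radius < 1).
    r = 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
    # O(n^3) inversion dominates, matching the quoted complexity per pass.
    S = np.linalg.inv(np.eye(n) - r * A) - np.eye(n)
    return S.sum(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 5] = X[:, 4] + 0.01 * rng.normal(size=200)   # feature 5 nearly duplicates 4
scores = inf_fs_scores(X)
ranking = np.argsort(scores)[::-1]                # feature indices, best first
```

Selecting the top-k indices of `ranking` then yields the reduced attribute set fed to the downstream classifier.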
“…The most attractive merit of Inf-FS is that it assesses the importance of a given feature while taking all the possible subsets of features into consideration. Consequently, the Inf-FS algorithm has been applied to a wide variety of fields for feature selection, and some good results have been achieved (Kaushik et al., 2021; Shao et al., 2018; Zhu et al., 2019). In this paper, the Inf-FS approach is employed to select the most informative fault features, which are fed into the DCFS model for the intelligent fault diagnosis of rolling bearings.…”
Section: Introduction
confidence: 99%