2018
DOI: 10.3390/info9070149

Effective Intrusion Detection System Using XGBoost

Abstract: As the world is on the verge of venturing into fifth-generation communication technology and embracing concepts such as virtualization and cloudification, the most crucial aspect remains "security", as more and more data get attached to the internet. This paper reflects a model designed to measure the various parameters of data in a network such as accuracy, precision, confusion matrix, and others. XGBoost is employed on the NSL-KDD (network socket layer-knowledge discovery in databases) dataset to ge…
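The abstract describes applying XGBoost to NSL-KDD and reporting metrics such as accuracy, precision, and a confusion matrix. A minimal sketch of that kind of pipeline is shown below; it is not the paper's code, and the file names (KDDTrain+.txt, KDDTest+.txt), column handling, and hyperparameter values are assumptions for illustration.

```python
# Minimal sketch of an XGBoost pipeline on NSL-KDD (not the paper's exact code).
# Assumes the standard KDDTrain+.txt / KDDTest+.txt files with 41 features plus
# a label column and a difficulty column; paths and settings are illustrative.
import pandas as pd
from sklearn.preprocessing import LabelEncoder
from sklearn.metrics import accuracy_score, precision_score, confusion_matrix
from xgboost import XGBClassifier

cols = [f"f{i}" for i in range(41)] + ["label", "difficulty"]
train = pd.read_csv("KDDTrain+.txt", names=cols)
test = pd.read_csv("KDDTest+.txt", names=cols)

# Binary target: normal traffic vs. any attack type.
y_train = (train["label"] != "normal").astype(int)
y_test = (test["label"] != "normal").astype(int)

X_train = train.drop(columns=["label", "difficulty"])
X_test = test.drop(columns=["label", "difficulty"])

# Encode the three categorical features (protocol_type, service, flag),
# which occupy columns f1-f3 in the standard NSL-KDD layout.
for c in ["f1", "f2", "f3"]:
    enc = LabelEncoder().fit(pd.concat([X_train[c], X_test[c]]))
    X_train[c] = enc.transform(X_train[c])
    X_test[c] = enc.transform(X_test[c])

model = XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print("accuracy :", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
print("confusion matrix:\n", confusion_matrix(y_test, pred))
```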

Cited by 297 publications (138 citation statements)
References: 17 publications
“…gradient boosting models, its implementation of parallel processing enables fast model training compared to many traditional models, and it can be deployed to high-performance platforms for large-scale parallel computing. The technique was found to outperform other machine learning and deep learning techniques in many competitions such as Kaggle and KDDCup (Chen and Guestrin, 2016; Dhaliwal et al., 2018), especially for datasets with sparse matrices. It has been successfully applied in many bioinformatic studies, such as miRNA-disease association (Chen et al., 2018), protein translocation (Mendik et al., 2019), protein-protein interactions (Basit et al., 2018), and DNA methylation (Zou et al., 2018).…”
mentioning
confidence: 98%
“…Classification models were built using a powerful tree boosting algorithm, XGBoost, which in 2015 was used by every winning team in the top 10 of the Data Mining and Knowledge Discovery competition for a wide range of machine learning problems [19] and was suggested to be one of the most sophisticated methods at the time of this work [20], [21]. Furthermore, the tree learning algorithm uses parallel and distributed computing, is approximately 10 times faster than existing methods, and allows many hyperparameters to be tuned to reduce the chance of overfitting [20].…”
Section: Methods
mentioning
confidence: 99%
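The statement above credits XGBoost's parallel tree construction and its many tunable hyperparameters with reducing the chance of overfitting. A minimal sketch of such a tuning step is given below, assuming scikit-learn's grid search; the parameter grid and cross-validation settings are illustrative, not values taken from the cited works.

```python
# Illustrative grid search over common XGBoost regularization knobs
# (tree depth, learning rate, row/column subsampling); the grid is an assumption.
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

param_grid = {
    "max_depth": [4, 6, 8],          # shallower trees generalize better
    "learning_rate": [0.05, 0.1],    # smaller steps give a more conservative fit
    "subsample": [0.8, 1.0],         # row subsampling adds randomness per tree
    "colsample_bytree": [0.8, 1.0],  # feature subsampling per tree
}

search = GridSearchCV(
    XGBClassifier(n_estimators=200, n_jobs=-1),  # parallel tree construction
    param_grid,
    scoring="accuracy",
    cv=3,
)
# search.fit(X_train, y_train)   # X_train / y_train as prepared in the sketch above
# print(search.best_params_, search.best_score_)
```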
“…Various methods have been proposed in the literature for network anomaly detection, including standard machine learning classifiers [4–29] and deep learning techniques [30–47]. Muda et al. performed clustering before classification and compared single classifiers with hybrid classifiers.…”
Section: Related Work
mentioning
confidence: 99%
“…Mirza developed an ensemble method that combines logistic regression, neural networks, and decision trees and obtained a 96.14% detection rate on the KDDCup99 dataset [25]. Dhaliwal et al. developed several XGBoost models and obtained 98.70% accuracy on the NSL-KDD dataset [26]. Dahiya and Srivastava compared two dimension reduction algorithms, canonical correlation analysis and linear discriminant analysis, with several classification algorithms and obtained at most a 95.53% accuracy rate on the UNSW-NB dataset using canonical correlation analysis with bagging [27].…”
Section: Related Work
mentioning
confidence: 99%
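The comparison described above pairs dimension reduction methods (canonical correlation analysis, linear discriminant analysis) with classifiers such as bagging. A minimal sketch of one such pairing, LDA feeding a bagged decision-tree classifier, is shown below; the estimators and parameters are illustrative assumptions, not the cited authors' configuration.

```python
# Sketch of a dimension-reduction-then-classify pipeline in the spirit of the
# comparison above (LDA followed by a bagged classifier); settings are assumed.
from sklearn.pipeline import make_pipeline
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

pipeline = make_pipeline(
    LinearDiscriminantAnalysis(),                                  # project onto discriminant axes
    BaggingClassifier(DecisionTreeClassifier(), n_estimators=50),  # bagged trees on the reduced features
)
# pipeline.fit(X_train, y_train); pipeline.score(X_test, y_test)
```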