2019
DOI: 10.5815/ijcnis.2019.07.06
Ensemble Feature Selection and Classification of Internet Traffic using XGBoost Classifier

Cited by 23 publications (14 citation statements)
References 30 publications
“…The proposed model is evaluated by the detection rate and the false positive rate. We select six state-of-the-art methods for the comparison of malicious traffic detection: two are detection models based on machine learning, and the other four are traffic detection models based on deep learning, specifically as follows: XGBoost 28 : The model is biased towards classes with more samples; because the features of the minority class are usually treated as noise and therefore ignored, it also tends to predict only the majority class. An ensemble classification model combining XGBoost with a tree method is proposed to improve classification performance.…”
Section: Resultsmentioning
confidence: 99%
“…The main advantage of XGBoost is its scalability and execution speed, which far exceed those of other machine learning algorithms [43]. Manju et al. [44] noted that models are biased towards classes with more samples; they also tend to predict only the majority class, since features of the minority class are often treated as noise and therefore ignored.…”
Section: Xgboost Modelmentioning
confidence: 99%
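The majority-class bias described in this statement is commonly countered by re-weighting the minority class before training. A minimal sketch, assuming binary labels and the negative/positive ratio convention used by boosting libraries (e.g. XGBoost's `scale_pos_weight` parameter); the function name is illustrative, not from the cited work:

```python
from collections import Counter

def imbalance_weight(labels):
    """Compute the negative/positive class ratio that is typically
    passed to a boosting library (e.g. as scale_pos_weight) so that
    minority-class errors are penalized more heavily, mitigating the
    majority-class bias described above."""
    counts = Counter(labels)
    # Ratio of majority (label 0) to minority (label 1) samples.
    return counts[0] / counts[1]

# Example: 90 benign flows vs. 10 malicious flows.
y = [0] * 90 + [1] * 10
print(imbalance_weight(y))  # 9.0
```

A weight of 9.0 here means each minority-class sample contributes nine times as much to the loss as a majority-class sample, which is one simple way to keep minority-class features from being treated as noise.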
“…As a result, the authors proposed an ensemble classification model using XGBoost to enhance classification performance. Compared with other models, XGBoost achieved better classification accuracy on classification problems [44]. We used Python libraries to implement all XGBoost processes.…”
Section: Xgboost Modelmentioning
confidence: 99%
“…Saber et al. [58] achieved traffic feature selection based on linear discriminant analysis. Manju et al. [59] ranked traffic features by feature weight, i.e. the number of times each feature appears in the trees, and selected the optimal feature subset based on the accuracy of the extreme gradient boosting model. Jamil et al. [60] created several candidate feature subsets with different feature selection algorithms, then chose the best subset by evaluating every candidate with five ML algorithms.…”
Section: Related Workmentioning
confidence: 99%
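The weight-based ranking attributed to Manju et al. [59] can be sketched in a few lines. The tree representation below (each tree as a flat list of split-feature indices) is a deliberate simplification for illustration, not the actual data structure of a gradient-boosting library:

```python
from collections import Counter

def rank_features_by_weight(trees):
    """Rank features by 'weight': the number of times each feature
    index is used as a split point across all trees in the ensemble.
    Features used in more splits are ranked first."""
    counts = Counter(f for tree in trees for f in tree)
    return [feature for feature, _ in counts.most_common()]

# Hypothetical ensemble of three small trees; feature 2 splits 4 times,
# feature 0 twice, feature 1 once.
trees = [[0, 2, 2], [1, 2], [0, 2]]
print(rank_features_by_weight(trees))  # [2, 0, 1]
```

The optimal subset would then be chosen by retraining the booster on progressively larger prefixes of this ranking and keeping the prefix with the best accuracy, as the citation statement describes.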