2021
DOI: 10.1007/978-981-16-3342-3_1
Machine Learning-Based Ensemble Network Security System

Cited by 2 publications (4 citation statements)
References 7 publications
“…However, the precision and recall values are not available. With an accuracy of 82.00%, an F1 score of 82.50%, a precision of 85.00%, and a recall of 85.00%, reference [35] introduced the DSSTE method. Reference [36] applied an NB approach and achieved a high accuracy of 97.14%, a stellar F1 score of 97.94%, and a precision of 96.72%.…”
Section: Comparative Analysis
confidence: 99%
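Since the statement above compares accuracy, precision, recall, and F1 scores across references, it may help to recall that F1 is the harmonic mean of precision and recall. A minimal sketch of that standard formula (not code from the cited works):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# When precision and recall are equal, F1 equals that shared value,
# e.g. reference [35]'s reported 85.00% precision and recall:
print(round(f1_score(0.85, 0.85), 4))  # 0.85
```

Note that a per-class macro- or weighted-averaged F1, as many toolkits report, need not equal the F1 computed from pooled precision and recall.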
“…Boosting models use a sequential iterative technique to combine multiple weak individual learners into one strong learner with better performance. To generate the output of this strong learner, a relationship based on a weighted average or voting is established between the base learners with different distributions [73]. Finally, stacking, also known as stacked generalization, is another ensemble learning technique that combines heterogeneous weak learners.…”
Section: Supervised Learning
confidence: 99%
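The weighted-average/voting combination described in the statement above can be sketched in a few lines of plain Python; the base-learner predictions and weights here are hypothetical stand-ins, not values from the cited work:

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Combine base-learner class predictions by weighted majority vote."""
    scores = defaultdict(float)
    for label, weight in zip(predictions, weights):
        scores[label] += weight
    # Return the class label with the highest accumulated weight.
    return max(scores, key=scores.get)

# Three hypothetical base learners classifying network traffic,
# weighted (for example) by their validation accuracy:
print(weighted_vote(["attack", "normal", "attack"], [0.6, 0.9, 0.7]))  # attack
```

With equal weights this reduces to simple majority voting; boosting frameworks instead derive the weights from each weak learner's training error.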
“…Table 2 provides the characteristics, advantages, and disadvantages of these supervised classification techniques.…”
Section: Supervised Learning
confidence: 99%
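The stacked generalization mentioned in both statements above, where heterogeneous weak learners feed a meta-learner, can be sketched as follows; the base rules and meta-learner here are deliberately trivial, hypothetical illustrations:

```python
def stack_predict(base_models, meta_model, x):
    """Stacked generalization: base-learner outputs become meta-learner inputs."""
    meta_features = [model(x) for model in base_models]
    return meta_model(meta_features)

# Hypothetical heterogeneous base learners (trivial threshold rules on flow stats):
base = [
    lambda flow: flow["packets"] > 100,
    lambda flow: flow["bytes"] > 5000,
]
# Hypothetical meta-learner: flag as attack if any base learner fires.
meta = lambda features: "attack" if any(features) else "normal"

print(stack_predict(base, meta, {"packets": 250, "bytes": 1200}))  # attack
```

In practice the meta-learner is itself trained, on out-of-fold predictions of the base learners, rather than hand-coded as here.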