2022
DOI: 10.2298/fuee2202269p
WK-FNN design for detection of anomalies in the computer network traffic

Abstract: Anomaly-based intrusion detection systems identify abnormal computer network traffic based on deviations from the derived statistical model that describes the normal network behavior. The basic problem with anomaly detection is deciding what is considered normal. Supervised machine learning can be viewed as binary classification, since models are trained and tested on a data set containing a binary label to detect anomalies. Weighted k-Nearest Neighbor and Feedforward Neural Network are high-…

Cited by 3 publications (6 citation statements)
References 42 publications
“…The accuracy of k-NN models was evaluated using the ADFA-WF and CAIDA datasets as benchmarks, and the results were presented in [104]. According to the authors of [105], the accuracy of the FNN and wk-NN models is over 99.0% and 99.1%, respectively. The authors of [85] used SVM, DT, k-NN, and linear discriminant analysis (LDA) to perform classification based on the novel X-IIoTID dataset.…”
Section: Binary Classification (mentioning)
confidence: 99%
“…The switching between the GD and GN algorithms is called the damping strategy [111]. The LM approximation and the damping strategy are described in more detail in our previous work [105,112]. The damping strategy is used here to speed up processing and to prevent problems caused by very large or very small gradients.…”
Section: Proposed Work (mentioning)
confidence: 99%
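To make the damping strategy concrete, the following is a minimal Python sketch (not the authors' implementation) of one Levenberg-Marquardt update, assuming a least-squares objective with caller-supplied residual and Jacobian functions; the factors lam_up and lam_down are illustrative choices.

```python
import numpy as np

def lm_step(residual_fn, jacobian_fn, w, lam, lam_up=10.0, lam_down=0.1):
    """One Levenberg-Marquardt update with a simple damping strategy.

    A large lam pushes the step toward gradient descent (GD);
    a small lam pushes it toward Gauss-Newton (GN).
    """
    r = residual_fn(w)                                   # residual vector e(w)
    J = jacobian_fn(w)                                   # Jacobian of residuals w.r.t. weights
    H = J.T @ J                                          # Gauss-Newton approximation of the Hessian
    g = J.T @ r                                          # gradient of 0.5 * ||e||^2
    dw = np.linalg.solve(H + lam * np.eye(len(w)), -g)   # damped normal equations
    w_new = w + dw
    # Damping strategy: accept the step and lower lam if the error drops,
    # otherwise keep the old weights and raise lam (move toward GD).
    if np.sum(residual_fn(w_new) ** 2) < np.sum(r ** 2):
        return w_new, lam * lam_down
    return w, lam * lam_up
```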
“…In previous work, the authors have shown the positive properties of hyperbolic-tangent scaling and the gains from improvements based on the Levenberg-Marquardt algorithm. The authors have also shown the benefits of the XOR operation for the detection of conflicting decisions [13][14][15]. The main idea of this research is that XOR detection of contradictory decisions on anomalies can increase the detection level of malware and bugs in computer networks, further improved by the use of an accumulator register that deals with the triggered outputs of the classifiers.…”
Section: Introduction (mentioning)
confidence: 99%
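The XOR detection of contradictory decisions and the accumulator register mentioned above can be sketched as follows; this is a minimal illustration rather than the paper's code, and the classifier outputs are made-up binary labels.

```python
import numpy as np

def xor_conflicts(wknn_pred, fnn_pred):
    """Return a boolean mask of contradictory decisions (XOR of binary labels)."""
    wknn_pred = np.asarray(wknn_pred, dtype=bool)
    fnn_pred = np.asarray(fnn_pred, dtype=bool)
    return np.logical_xor(wknn_pred, fnn_pred)

wknn_pred = [0, 1, 1, 0, 1]               # 1 = anomaly, 0 = normal
fnn_pred  = [0, 1, 0, 0, 1]
conflicts = xor_conflicts(wknn_pred, fnn_pred)
accumulator = int(conflicts.sum())        # accumulator register of triggered outputs
print(conflicts, accumulator)             # [False False  True False False] 1
```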
“…A k-NN is the most well-known distance-based algorithm; it assigns a new instance to the class to which most of its k nearest neighbors belong [12,13]. A k-NN model with k = 10 and a similarity measure based on Euclidean distance is used because of its robustness to noisy data, flexibility, and easy implementation [14]. The wk-NN model is used because it extends the k-NN model to improve accuracy by weighting neighbors that are closer to the new instance more heavily in the decision than neighbors that are more distant [15].…”
Section: Introduction (mentioning)
confidence: 99%
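A minimal sketch of the wk-NN rule described in the statement above (k = 10, Euclidean distance, inverse-distance weighting of the vote); the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def wknn_predict(X_train, y_train, x, k=10, eps=1e-9):
    """Binary prediction for x by inverse-distance-weighted voting of k neighbors."""
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    dists = np.linalg.norm(X_train - np.asarray(x), axis=1)  # Euclidean distances
    idx = np.argsort(dists)[:k]                              # k nearest neighbors
    weights = 1.0 / (dists[idx] + eps)                       # closer neighbors weigh more
    score_anomaly = weights[y_train[idx] == 1].sum()         # weighted vote for class 1
    score_normal = weights[y_train[idx] == 0].sum()          # weighted vote for class 0
    return int(score_anomaly > score_normal)
```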
“…Due to its high prediction speed and low memory cost, a medium DT (Iterative Dichotomiser 3 algorithm) with 20 splits is used [16]. A feedforward neural network (FNN) with one hidden layer (nine input, nine hidden, and one output neuron) is used because of its fast processing speed and generalization ability [14,15]. It is one of the simplest and quickest models that rely on backpropagation to produce results based on the predicted probabilities and classification thresholds.…”
Section: Introduction (mentioning)
confidence: 99%
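The 9-9-1 FNN and the classification threshold described above can be illustrated with a short forward-pass sketch. The random weights and the tanh/sigmoid activations are assumptions for illustration, since the statement does not specify them; in the paper the weights would be learned by backpropagation-based (LM) training.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((9, 9)), np.zeros(9)     # input -> hidden (9 -> 9)
W2, b2 = rng.standard_normal(9), 0.0                  # hidden -> output (9 -> 1)

def fnn_forward(x, threshold=0.5):
    """Forward pass of the 9-9-1 network; returns (label, predicted probability)."""
    h = np.tanh(x @ W1 + b1)                          # hidden layer activation (assumed tanh)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))          # sigmoid output: predicted probability
    return int(p >= threshold), p                     # apply classification threshold

label, prob = fnn_forward(rng.standard_normal(9))
```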