2016
DOI: 10.3390/info7020020
A Big Network Traffic Data Fusion Approach Based on Fisher and Deep Auto-Encoder

Abstract: Data fusion is usually performed prior to classification in order to reduce the input space. These dimensionality reduction techniques help to reduce the complexity of the classification model and thus improve classification performance. Traditional supervised methods demand labeled samples, while most current network traffic data is unlabeled. Better learners can therefore be built by using both labeled and unlabeled data than by using either alone. In this paper, a novel network traffic data f…

Cited by 28 publications (11 citation statements) · References 25 publications
“…A DAE is a feedforward neural network strategy for fast unsupervised computing execution [85]. It estimates an identity task, where the output (x) is set equal to the input (x) so as to construct a representation of a collection of data, that is, (x ⟶ x).…”
Section: Deep Autoencoder (DAE) (mentioning)
confidence: 99%
“…An AE's simplest framework comprises three layers: input, hidden, and output. If the training data (x⁽ⁱ⁾) has n samples, each (x⁽ⁱ⁾) (i ∈ (1, ⋯, n)) has several dimensions, as well as a spatial feature vector (d0); the Tanh activation function [85] is used and calculated using…”
Section: Deep Autoencoder (DAE) (mentioning)
confidence: 99%
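The three-layer autoencoder described above can be sketched as follows. This is a minimal illustration, not the paper's actual configuration: the hidden size, learning rate, epoch count, and the synthetic data are all assumptions, and only the structure (input layer, Tanh hidden layer, output layer trained to reproduce the input) follows the quoted description.

```python
# Minimal single-hidden-layer autoencoder with a Tanh hidden activation.
# The network is trained so that the output x_hat approximates the input x.
# Hyperparameters and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder(X, hidden_dim=2, lr=0.5, epochs=500):
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden_dim))   # encoder weights
    b1 = np.zeros(hidden_dim)
    W2 = rng.normal(0, 0.1, (hidden_dim, d))   # decoder weights
    b2 = np.zeros(d)
    losses = []
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)               # hidden representation
        X_hat = H @ W2 + b2                    # reconstruction of the input
        err = X_hat - X
        losses.append(float(np.mean(err ** 2)))
        # Backpropagate the mean-squared reconstruction error.
        dX_hat = 2 * err / (n * d)
        dW2 = H.T @ dX_hat
        db2 = dX_hat.sum(axis=0)
        dH = dX_hat @ W2.T
        dZ = dH * (1 - H ** 2)                 # derivative of tanh
        dW1 = X.T @ dZ
        db1 = dZ.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, losses

# Standardized synthetic data standing in for traffic features.
X = rng.normal(size=(50, 4))
X = (X - X.mean(axis=0)) / X.std(axis=0)
W1, b1, losses = train_autoencoder(X)
code = np.tanh(X @ W1 + b1)  # reduced representation fed to a classifier
```

In the fusion approach the paper describes, the learned hidden representation (here `code`) serves as the reduced input space for downstream classifiers.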
“…Ettercap [20,26], which is depicted in Figure 2, is mainly used on Kali Linux devices. It is used to perform different types of attacks; the device running Ettercap acts as a malicious node to perform the man-in-the-middle attack.…”
Section: Methods (mentioning)
confidence: 99%
“…Their system was evaluated on the KDD Cup 99 dataset, and the results showed that combining an autoencoder and DBN can achieve better detection accuracy than working with DBN alone. Tao et al [33] proposed a data fusion approach based on the Fisher score and deep autoencoder to reduce the dimensionality of data. Using the KDD Cup 99 dataset, it was confirmed that integrating the deep autoencoder as a feature extraction method can improve the accuracy of classification algorithms such as J48, the backpropagation neural network, and SVM.…”
Section: Related Work (mentioning)
confidence: 99%
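The Fisher-score half of the approach cited above ranks each feature by how well it separates the classes. The sketch below assumes the standard per-feature Fisher score (between-class scatter over within-class scatter); the function name, the toy data, and the exact normalization are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of Fisher-score feature ranking: for each feature,
# score = sum_c n_c * (mu_c - mu)^2  /  sum_c n_c * var_c,
# where c ranges over classes. Higher scores mean more discriminative features.
import numpy as np

def fisher_score(X, y):
    classes = np.unique(y)
    mu = X.mean(axis=0)                         # global per-feature mean
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        nc = len(Xc)
        num += nc * (Xc.mean(axis=0) - mu) ** 2  # between-class scatter
        den += nc * Xc.var(axis=0)               # within-class scatter
    return num / (den + 1e-12)                   # small eps avoids div by zero

# Toy data: feature 0 separates the two classes, feature 1 is pure noise.
rng = np.random.default_rng(1)
y = np.array([0] * 30 + [1] * 30)
X = np.column_stack([y + 0.1 * rng.normal(size=60), rng.normal(size=60)])
scores = fisher_score(X, y)
```

Features with the highest scores are kept before the deep autoencoder performs further nonlinear reduction, which matches the two-stage pipeline the citation describes.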