2022
DOI: 10.1109/access.2022.3159550

An Abnormal Traffic Detection Model Combined BiIndRNN With Global Attention

Abstract: As time-series data with internal correlation, airborne network traffic data can be used for anomaly detection with a Recurrent Neural Network (RNN) and its variants, but existing models are difficult to parallelize and are prone to gradient explosion or vanishing. To address this problem, we propose a Bidirectional Independent Recurrent Neural Network (BiIndRNN) with parallel computation and adjustable gradient, which can extract the bidirectional structural features of network traffic by forward …
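To make the architecture named in the abstract concrete, here is a minimal sketch of a bidirectional IndRNN with a global attention pooling layer: each IndRNN unit recurs through an element-wise (independent) weight, the sequence is processed in both directions, and a soft-attention layer pools the per-step states before classification. It assumes PyTorch; the class names (IndRNNCell, BiIndRNNAttention), layer sizes, and the exact attention form are illustrative and not taken from the paper.

```python
# Minimal sketch of a BiIndRNN with global attention (illustrative, not the
# paper's implementation); assumes PyTorch is available.
import torch
import torch.nn as nn


class IndRNNCell(nn.Module):
    """IndRNN step: h_t = relu(W x_t + u * h_{t-1} + b), where u is an
    element-wise recurrent weight, so each hidden unit recurs independently."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.w = nn.Linear(input_size, hidden_size)
        self.u = nn.Parameter(torch.empty(hidden_size).uniform_(-1.0, 1.0))

    def forward(self, x, h):
        return torch.relu(self.w(x) + self.u * h)


class BiIndRNNAttention(nn.Module):
    """Run an IndRNN over the sequence in both directions, concatenate the
    per-step states, and pool them with a global soft-attention layer."""

    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.fwd = IndRNNCell(input_size, hidden_size)
        self.bwd = IndRNNCell(input_size, hidden_size)
        self.att = nn.Linear(2 * hidden_size, 1)
        self.out = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):                       # x: (batch, time, features)
        b, t, _ = x.shape
        hf = x.new_zeros(b, self.fwd.u.numel())
        hb = x.new_zeros(b, self.bwd.u.numel())
        fwd_states, bwd_states = [], []
        for i in range(t):
            hf = self.fwd(x[:, i], hf)          # forward pass over time
            hb = self.bwd(x[:, t - 1 - i], hb)  # backward pass over time
            fwd_states.append(hf)
            bwd_states.append(hb)
        bwd_states.reverse()                    # align backward states with time
        states = torch.stack(
            [torch.cat([f, bk], dim=-1) for f, bk in zip(fwd_states, bwd_states)],
            dim=1)                              # (batch, time, 2 * hidden)
        weights = torch.softmax(self.att(states), dim=1)   # global attention
        context = (weights * states).sum(dim=1)
        return self.out(context)


# Example: classify 20-step traffic windows with 8 features into 2 classes.
model = BiIndRNNAttention(input_size=8, hidden_size=32, num_classes=2)
logits = model(torch.randn(4, 20, 8))
print(logits.shape)                             # torch.Size([4, 2])
```

Because the recurrent weight is element-wise rather than a full matrix, each step can be computed without a matrix-matrix recurrence and the recurrent weights can be constrained to keep gradients in a workable range, which is the "parallel computation and adjustable gradient" property the abstract refers to.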

Cited by 11 publications (2 citation statements)
References: 44 publications
“…Based on our understanding of neural networks [34][35][36][37][38], we hypothesized that the ResNet residual network effectively captures the spatial (2D) feature information of traffic data. However, it overlooks the sequence features inherent in traffic data, which may result in incomplete feature extraction and consequently impact the detection of malicious traffic.…”
Section: Experiments and Results
Mentioning confidence: 99%
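The point in that statement, that a purely spatial extractor discards step ordering unless a sequence model follows it, can be illustrated with a minimal sketch: a small 2D CNN summarizes each step, and an RNN then models the order of those summaries. This is not the cited paper's model; it assumes PyTorch, and the class name (SpatialThenSequential), the GRU choice, and all sizes are illustrative.

```python
# Illustrative sketch only: per-step spatial features (CNN) followed by a
# sequence model (GRU); not reproduced from either cited paper.
import torch
import torch.nn as nn


class SpatialThenSequential(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # Per-step spatial extractor (stand-in for a ResNet-style backbone).
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())        # -> 16 features/step
        # Sequence model over the per-step features.
        self.rnn = nn.GRU(16, 32, batch_first=True)
        self.cls = nn.Linear(32, num_classes)

    def forward(self, x):              # x: (batch, time, 1, height, width)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).view(b, t, -1)  # spatial features
        _, h = self.rnn(feats)                            # temporal features
        return self.cls(h[-1])


logits = SpatialThenSequential()(torch.randn(4, 10, 1, 16, 16))
print(logits.shape)                    # torch.Size([4, 2])
```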
“…By introducing the gating mechanism, LSTM, which is based on the recurrent neural network (RNN), can effectively solve the gradient explosion and gradient vanishing problems of traditional RNNs (Li et al., 2022). Figure 1 shows the schematic diagram of the basic principle of LSTM.…”
Section: General Long Short-term Memory Model
Mentioning confidence: 99%
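For context on the gating mechanism that statement refers to, the standard LSTM update (textbook formulation, not reproduced from the cited paper) is:

```latex
% Standard LSTM cell equations; \sigma is the logistic sigmoid, \odot the element-wise product
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{forget gate} \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{input gate} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{output gate} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate cell state} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{cell state update} \\
h_t &= o_t \odot \tanh(c_t) && \text{hidden state}
\end{aligned}
```

The additive cell-state update, c_t = f_t ⊙ c_{t-1} + i_t ⊙ c̃_t, lets gradients flow across many time steps without being repeatedly multiplied through a recurrent weight matrix, which is the mechanism behind the claim about avoiding gradient explosion and vanishing.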