2023
DOI: 10.32604/cmc.2023.031907
Intrusion Detection Based on Bidirectional Long Short-Term Memory with Attention Mechanism

Abstract: With the recent developments in the Internet of Things (IoT), the amount of data collected has expanded tremendously, resulting in a higher demand for data storage, computational capacity, and real-time processing capabilities. Cloud computing has traditionally played an important role in establishing IoT. However, fog computing has recently emerged as a new field complementing cloud computing due to its enhanced mobility, location awareness, heterogeneity, scalability, low latency, and geographic distribution…

Cited by 44 publications (4 citation statements)
References 35 publications (37 reference statements)
“…Yang et al [19] proposed an intrusion detection model based on a two-layer bidirectional long short-term memory (Bi-LSTM) deployed on fog. In the first stage, fog nodes identify attacks in the traffic received from IoT devices and then send the results to the cloud to summarize the global security condition.…”
Section: Related Work
confidence: 99%
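The two-stage scheme quoted above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the threshold rule in `fog_detect` merely stands in for the Bi-LSTM classifier the fog nodes actually run, and the field name `rate` is an assumed feature of a traffic record.

```python
from collections import Counter

def fog_detect(traffic):
    """Stage 1: a fog node labels each flow (stand-in for the Bi-LSTM model)."""
    return ["attack" if flow["rate"] > 100 else "benign" for flow in traffic]

def cloud_summarize(alerts_per_node):
    """Stage 2: the cloud aggregates per-node alerts into a global security view."""
    total = Counter()
    for alerts in alerts_per_node:
        total.update(alerts)
    return dict(total)

# Two fog nodes classify their local IoT traffic, then report to the cloud.
node_a = fog_detect([{"rate": 10}, {"rate": 250}])
node_b = fog_detect([{"rate": 5}])
print(cloud_summarize([node_a, node_b]))  # {'benign': 2, 'attack': 1}
```

The split mirrors the architecture's motivation: latency-sensitive per-flow detection stays at the network edge, while only compact alert summaries travel to the cloud.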
“…Several experiments were designed, and the results proved that HAT-UDA improves detection rates of unknown attacks significantly. Yongjie et al [33] analyzed network traffic data over time. Their model was based on bidirectional long short-term memory (Bi-LSTM) and was endowed with an attention mechanism to classify traffic data.…”
Section: Related Work
confidence: 99%
“…Furthermore, the attention mechanism [16], an important technique in deep learning, allows models to "focus" on different parts by assigning different "attention" to them, thereby enhancing the model's ability to focus on the most relevant information in text and improving the ability to process complex language features. For example, An and colleagues [17] proposed a financial system attack detection model called Finsformer based on a transformer and synchronized attention mechanism.…”
Section: Introduction
confidence: 99%
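The attention mechanism described above, which assigns different weights to different parts of a sequence, can be sketched with a minimal softmax pooling over hidden states. This is an illustrative sketch under assumed shapes: `H` is a `(T, d)` matrix of per-timestep states (e.g. Bi-LSTM outputs), and the scoring vector `w` stands in for a learned parameter.

```python
import numpy as np

def attention_pool(H, w):
    """Softmax-attention pooling: weight each timestep by its relevance score."""
    scores = H @ w                        # one scalar score per timestep
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax: non-negative, sums to 1
    context = weights @ H                 # weighted sum of hidden states
    return weights, context

# Three timesteps of 2-dim hidden states; w "attends" to the first feature.
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = np.array([2.0, 0.0])
weights, context = attention_pool(H, w)
print(weights.round(3))  # timesteps where feature 1 is active get higher weight
```

The resulting context vector, rather than only the final hidden state, is what a classifier head would consume, letting the model focus on the most relevant timesteps of the traffic sequence.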