2018 IEEE 24th International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA)
DOI: 10.1109/rtcsa.2018.00035
Hierarchical Attention-Based Anomaly Detection Model for Embedded Operating Systems

Cited by 20 publications (13 citation statements)
References 15 publications
“…With the development of deep learning techniques, Ezeme et al. introduced a hierarchical attention-based anomaly detection (HAbAD) model based on stacked Long Short-Term Memory (LSTM) networks with attention [9]. Yahyaoui et al. proposed a detection protocol that dynamically executes the on-demand Support Vector Machine (SVM) classifier in a hierarchical way whenever an intrusion is suspected [10].…”
Section: Related Work (mentioning)
confidence: 99%
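The HAbAD architecture referenced in [9] and [13] is described in the statement above only at a high level: stacked LSTMs over kernel event sequences followed by an attention layer. A minimal, hypothetical sketch of that general idea is given below; the layer sizes, the `event_vocab_size` parameter, and the next-event prediction objective are illustrative assumptions, not the published configuration.

```python
# Hypothetical sketch of a stacked-LSTM-with-attention anomaly scorer over
# kernel event sequences. Sizes and names are illustrative, not from [9]/[13].
import torch
import torch.nn as nn

class StackedLSTMAttention(nn.Module):
    def __init__(self, event_vocab_size=512, embed_dim=64, hidden_dim=128, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(event_vocab_size, embed_dim)
        # Stacked LSTM encoder over the event sequence.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=num_layers, batch_first=True)
        # Additive attention: score each timestep, then take the weighted sum.
        self.attn_score = nn.Linear(hidden_dim, 1)
        # Head predicting the next event; a surprising next event => higher loss.
        self.out = nn.Linear(hidden_dim, event_vocab_size)

    def forward(self, event_ids):                    # event_ids: (batch, seq_len)
        h, _ = self.lstm(self.embed(event_ids))      # (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.attn_score(h), dim=1)   # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)           # (batch, hidden_dim)
        return self.out(context)                     # logits over the next event

# Usage sketch: per-trace cross-entropy on the next event serves as a stand-in
# anomaly score (higher loss => more surprising trace).
model = StackedLSTMAttention()
traces = torch.randint(0, 512, (4, 100))             # 4 traces, 100 events each
next_events = torch.randint(0, 512, (4,))
loss = nn.functional.cross_entropy(model(traces), next_events, reduction="none")
print(loss)                                           # per-trace anomaly scores
```

The next-event loss here is only a stand-in scoring rule; the cited works define their own scoring and thresholding.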
“…Still, they do not consider the temporal relationship amongst the sequence of system calls. Also, in [17], [18], a hierarchical LSTM network is used to explore the relationships amongst the kernel event traces of an embedded system, but other features that would ordinarily yield a more representative model, such as timestamps, CPU cycles, and system call arguments, are skipped.…”
Section: Related Work (mentioning)
confidence: 99%
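The statement above points to side features (timestamps, CPU cycles, system call arguments) that the cited hierarchical LSTM models leave out. A hedged sketch of one way such features could be concatenated with the event embedding before a recurrent encoder follows; the feature set, names, and dimensions are assumptions for illustration, not anything specified in the cited papers.

```python
# Hypothetical sketch: concatenating an event-ID embedding with numeric side
# features (timestamp delta, CPU cycles, argument count) before an LSTM.
import torch
import torch.nn as nn

class EventFeatureEncoder(nn.Module):
    def __init__(self, event_vocab_size=512, embed_dim=64, num_side_features=3, out_dim=64):
        super().__init__()
        self.embed = nn.Embedding(event_vocab_size, embed_dim)
        # Project [embedding | normalized side features] to a fixed width.
        self.proj = nn.Linear(embed_dim + num_side_features, out_dim)

    def forward(self, event_ids, side_features):
        # event_ids: (batch, seq_len); side_features: (batch, seq_len, num_side_features)
        x = torch.cat([self.embed(event_ids), side_features], dim=-1)
        return torch.relu(self.proj(x))               # (batch, seq_len, out_dim)

# Usage sketch with fabricated shapes only (no real trace data).
enc = EventFeatureEncoder()
ids = torch.randint(0, 512, (2, 50))
side = torch.rand(2, 50, 3)                            # e.g. dt, cycles, argc (normalized)
print(enc(ids, side).shape)                            # torch.Size([2, 50, 64])
```

The resulting per-event vectors could then be fed to a recurrent encoder such as the one sketched earlier.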
“…Ezeme et al. [9], [13] and Salem et al. [18] have used the same dataset for model validation. Although the models of [9], [18] are offline and do not take temporal relationships in the sequence of events into account, we compare DReAM with them, as those are the works we know of that target anomaly detection in an embedded system via trace analysis and use the same dataset for model validation.…”
Section: B. DReAM vs. Other Models (mentioning)
confidence: 99%
“…Although the models of [9], [18] are offline and do not take temporal relationships in the sequence of events into account, we compare DReAM with them, as those are the works we know of that target anomaly detection in an embedded system via trace analysis and use the same dataset for model validation. We also include the HAbAD model of [13] in the comparison, as this work builds on its architecture to reduce the high false positive rate recorded for the HAbAD model. Table 3 displays the true and false positive rates from those works compared to DReAM.…”
Section: B. DReAM vs. Other Models (mentioning)
confidence: 99%
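The comparison above is stated in terms of true and false positive rates; the actual numbers in Table 3 of the citing work are not reproduced here. For reference only, a minimal sketch of how those two rates are computed from binary anomaly labels and predictions follows; the labels and predictions shown are fabricated placeholders, not results from any of the cited models.

```python
# Minimal sketch: true positive rate (TPR) and false positive rate (FPR)
# from binary labels (1 = anomalous trace) and binary predictions.
def tpr_fpr(labels, preds):
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    fn = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    fp = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    tpr = tp / (tp + fn) if (tp + fn) else 0.0   # detection rate on anomalies
    fpr = fp / (fp + tn) if (fp + tn) else 0.0   # false alarms on normal traces
    return tpr, fpr

# Illustrative placeholder values, not results from any of the cited models.
print(tpr_fpr([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0]))  # (0.666..., 0.333...)
```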