2021
DOI: 10.1007/978-3-030-86472-9_31
Log-Based Anomaly Detection with Multi-Head Scaled Dot-Product Attention Mechanism

Cited by 7 publications (6 citation statements)
References 13 publications
“…Transformers (TF) make use of so-called self-attention mechanisms to embed data instances into a vector space, where similar instances should be closer to each other than dissimilar ones [24], [33], [39]. The goal of Transformers is to assign weights to specific inputs according to the context of their occurrence, such as words in sentences.…”
Section: B. Deep Learning Techniques (mentioning)
Confidence: 99%
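The statement above describes self-attention in general terms; the mechanism named in the paper's title is the scaled dot-product attention of Vaswani et al., Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, computed in several heads in parallel. The following is a minimal NumPy sketch of that general mechanism, not the paper's implementation; the dimensions, weight names, and the log-window example are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    return softmax(scores) @ V  # context-dependent weighting of V

def multi_head_attention(X, W_q, W_k, W_v, W_o, n_heads):
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    # Project the input into n_heads lower-dimensional subspaces:
    # resulting shape (n_heads, seq_len, d_head).
    split = lambda W: (X @ W).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    heads = scaled_dot_product_attention(split(W_q), split(W_k), split(W_v))
    # Concatenate the heads and apply the output projection.
    return heads.transpose(1, 0, 2).reshape(seq_len, d_model) @ W_o

# Illustrative only: attend over a window of 8 log-event vectors with 2 heads.
rng = np.random.default_rng(0)
d_model, n_heads, seq_len = 16, 2, 8
X = rng.standard_normal((seq_len, d_model))
W_q, W_k, W_v, W_o = (0.1 * rng.standard_normal((d_model, d_model)) for _ in range(4))
print(multi_head_attention(X, W_q, W_k, W_v, W_o, n_heads).shape)  # (8, 16)
```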
“…comp. Failures [1], [17], [20], [22], [23], [25], [27], [29]-[31], [33], [37]-[40], [42], [44], [46]-[48], [51], [52], [55], [56], [58], [59], [61], [62], [65]-[74], [76]-[78] BlueGene/L (BGL) [89] 2007 High-perf. comp.…”
Section: Data Set (mentioning)
Confidence: 99%
“…[77] also employed the transformer-encoder architecture to develop an unsupervised anomaly detection technique called A2Log. Other recent studies have utilized self-attention with different transformer variants for error and anomaly detection, such as LAnoBERT [49], LogAttention [25], and [48]. However, our model utilized self-attention and a transformer neural network architecture to predict failures in HPC system components (nodes).…”
Section: Related Work (mentioning)
Confidence: 99%
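The citing papers above all follow the same broad pattern: encode a window of parsed log events with a transformer (self-attention) encoder, then score the window for anomalies or failures. The sketch below shows one common way to structure that pattern in PyTorch; the template vocabulary size, layer dimensions, mean-pooling, and two-class head are illustrative assumptions and do not reproduce the architecture of this paper, A2Log, LAnoBERT, or LogAttention.

```python
import torch
import torch.nn as nn

class LogTransformerDetector(nn.Module):
    def __init__(self, n_templates=200, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Each parsed log template (event type) gets a learned embedding.
        self.embed = nn.Embedding(n_templates, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 2)  # normal vs. anomalous

    def forward(self, template_ids):
        # template_ids: (batch, window_len) integer event-type indices.
        h = self.encoder(self.embed(template_ids))
        return self.head(h.mean(dim=1))  # mean-pool the window, then classify

# Illustrative only: score a batch of 3 windows of 20 log events each.
model = LogTransformerDetector()
logits = model(torch.randint(0, 200, (3, 20)))
print(logits.shape)  # torch.Size([3, 2])
```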