2022
DOI: 10.1080/08839514.2022.2145642
BERT-Log: Anomaly Detection for System Logs Based on Pre-trained Language Model

Cited by 34 publications (11 citation statements) | References 39 publications
“…Figure 3 shows how a test-taker’s actions for U01A – Party Invitations were transformed into a vector of token indices. In the final step, we ran the three vectors (token, positional, and segment embeddings) through the BERT embedding layer to form a 768-dimensional embedding vector based on the last layer (S. Chen & Liao, 2022).…”
Section: Methods
confidence: 99%
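The quoted pipeline sums token, positional, and segment embeddings before passing them through BERT; the 768-dimensional vectors it cites come from the model's last encoder layer. A minimal NumPy sketch of the input-embedding step is below (vocabulary and sequence sizes are hypothetical, and real BERT additionally applies LayerNorm and dropout):

```python
import numpy as np

# Sketch of a BERT-style input embedding layer: token, positional, and
# segment embeddings are summed per position. Sizes are hypothetical;
# 768 matches the BERT-base hidden dimension cited in the quote.
VOCAB, MAX_LEN, SEGMENTS, DIM = 100, 32, 2, 768
rng = np.random.default_rng(0)
token_emb = rng.normal(size=(VOCAB, DIM))
pos_emb = rng.normal(size=(MAX_LEN, DIM))
seg_emb = rng.normal(size=(SEGMENTS, DIM))

def embed(token_ids, segment_ids):
    """Return one DIM-dimensional vector per input position."""
    positions = np.arange(len(token_ids))
    return token_emb[token_ids] + pos_emb[positions] + seg_emb[segment_ids]

vectors = embed([2, 17, 5], [0, 0, 0])
print(vectors.shape)  # (3, 768)
```

In the full model these summed vectors are only the input; the dense representation used downstream is taken from the final Transformer layer.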
“…In recent years, various deep learning methods have been proposed for anomaly detection in the system log files generated by an operating system, application, server, and so on (e.g., Catillo et al., 2022; Le & Zhang, 2021; Wittkopp et al., 2021). Furthermore, some researchers have combined these methods with NLP techniques to facilitate the processing of complex messages in log files (e.g., S. Chen & Liao, 2022; Guo et al., 2021; Ryciak et al., 2022; Shao et al., 2022). With pretrained large language models such as BERT (Devlin et al., 2018), sequential log entries over time or across multiple users are transformed into dense, numerical vectors (i.e., embeddings) that retain the order and dependency of the entries in a lower-dimensional space.…”
Section: Anomaly Detection With Process Data
confidence: 99%
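Before log entries can be embedded as described above, a preprocessing step typically collapses each raw line to an event template so that recurring events share an ID. A hypothetical sketch (masking only digit runs; real systems use dedicated log parsers such as Drain, and these example lines are invented):

```python
import re

# Hypothetical preprocessing: mask variable fields (here, just digit runs)
# so recurring events collapse to shared template IDs that a sequence
# model can then embed as "tokens".
def to_template(line: str) -> str:
    return re.sub(r"\d+", "<NUM>", line)

logs = [
    "Receiving block 123 from /10.0.0.1",
    "Receiving block 456 from /10.0.0.2",
    "Deleting block 789",
]
templates: dict[str, int] = {}
sequence = [templates.setdefault(to_template(l), len(templates)) for l in logs]
print(sequence)  # [0, 0, 1] -- the first two lines share a template
```

The resulting ID sequence preserves the order of events, which is what lets an embedding model retain "the order and dependency of the entries" noted in the quote.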
“…These methods utilize the self-attention mechanism of the Transformer to analyze relationships within event logs, thereby improving anomaly detection accuracy. LogBERT [11]: LogBERT is a log anomaly detection method based on pre-trained language models. It draws inspiration from the BERT model used in natural language processing and applies it to event log analysis.…”
Section: Classification Of Log Anomaly Analysis Methods
confidence: 99%
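The self-attention mechanism this quote attributes to Transformer-based methods can be sketched in a few lines. This is a toy single-head variant without learned projections; real Transformers add query/key/value weight matrices and multi-head attention, and the sizes here are arbitrary:

```python
import numpy as np

# Toy single-head self-attention over a sequence of log-event embeddings:
# each event's output is a softmax-weighted mix of all events, so the
# model can relate every log entry to every other entry.
def self_attention(X):
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                   # pairwise relevance
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)              # row-wise softmax
    return w @ X                                    # context-mixed events

rng = np.random.default_rng(0)
events = rng.normal(size=(5, 8))  # 5 log events, 8-dim embeddings
out = self_attention(events)
print(out.shape)  # (5, 8)
```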
“…Table III furnishes the list of state-of-the-art anomaly detection tools, vector generation techniques, and log features. [Flattened table; per-tool feature checkmarks did not survive extraction. Tools and years listed: [22] 2019, Swisslog [31] 2020, Logtransfer [32] 2020, HitAnomaly [33] 2020, LogFlow [34] 2021, Sprelog [17] 2021, LogUAD [13] 2022, DeepSyslog [26] 2022, BERT-Log [35] 2022.]…”
Section: Comparison With Existing Techniques
confidence: 99%