2023
DOI: 10.3390/s23031104
Disentangled Dynamic Deviation Transformer Networks for Multivariate Time Series Anomaly Detection

Abstract: Graph neural networks have been widely used by multivariate time series-based anomaly detection algorithms to model the dependencies of system sensors. Previous studies have focused on learning the fixed dependency patterns between sensors. However, they ignore that the inter-sensor and temporal dependencies of time series are highly nonlinear and dynamic, leading to inevitable false alarms. In this paper, we propose a novel disentangled dynamic deviation transformer network (D3TN) for anomaly detection of mul…

Cited by 7 publications (3 citation statements) · References 47 publications
“…D3TN: Disentangled Dynamic Deviation Transformer Network (D3TN) [52] is a highly effective system for multivariate time series anomaly detection. It considers both short-term and long-term temporal dependencies as well as complex inter-sensor dependencies.…”
Section: Transformer
confidence: 99%
“…Although LSTM-based methods achieve satisfying performance in some anomaly detection tasks, their high computational cost and their inefficiency at learning long temporal patterns still limit their performance. Recently, innovations based on the vanilla Transformer [22] have been widely applied in various fields such as natural language processing (NLP) [28,29], computer vision (CV) [30,31,32,33] and time series applications [34,35,36]. The vanilla Transformer has four key modules: the attention module, feed-forward module, layer normalisation module, and positional encoding module [40].…”
Section: Transformer
confidence: 99%
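The excerpt above names the four key modules of the vanilla Transformer. As an illustration only — this is a generic, single-head NumPy sketch of those modules, not the paper's D3TN implementation; the function names, the post-norm residual wiring, and the FFN weights passed as parameters are all assumptions for this example:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Attention module: scaled dot-product attention (single head).
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    return softmax(scores) @ V

def layer_norm(x, eps=1e-5):
    # Layer normalisation module: normalise each token's features.
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True)
    return (x - mu) / (sigma + eps)

def feed_forward(x, W1, b1, W2, b2):
    # Feed-forward module: two linear layers with a ReLU in between.
    return np.maximum(0.0, x @ W1 + b1) @ W2 + b2

def positional_encoding(seq_len, d_model):
    # Positional encoding module: fixed sinusoidal encodings.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def encoder_block(x, W1, b1, W2, b2):
    # One post-norm encoder block: self-attention and FFN,
    # each wrapped in a residual connection + layer norm.
    x = layer_norm(x + attention(x, x, x))
    x = layer_norm(x + feed_forward(x, W1, b1, W2, b2))
    return x
```

In a time-series setting such as the one the paper targets, `x` would be a window of sensor readings of shape `(timesteps, features)`, with `positional_encoding` added to `x` before the first block so the attention module can distinguish time steps.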
“…Through the analysis of time series data, it is possible to reveal the underlying patterns and laws in the data, discover the correlation and periodicity between events, and then deeply understand the nature and mechanism of the event itself, providing strong support for research in related disciplines. Specifically, a deep understanding of time trends [14], periodicity [15,16], correlation [17,18], etc., can be gained and valuable information can be further extracted, such as anomaly detection [19,20,21], classification [22,23,24], clustering [25,26], etc. These studies require a large amount of time series data for experiments to test the effectiveness and practicality of different algorithms and techniques, optimize the parameters and structure of algorithms, evaluate the performance and accuracy of different techniques, and train machine learning models.…”
Section: Introduction
confidence: 99%