“…Several studies have proposed using self-attention with different transformer variants to detect HPC anomalies. These approaches rely on a self-attention-based transformer decoder [48] or on self-attention with different transformer-encoder variants [49], [50]. In contrast, our Time Machine is a real-time generative model that uses two stacks of self-supervised transformer decoders to predict log events, component (e.g., node) failures, and the lead time to the predicted failures in HPC systems.…”
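To make the two-stack decoder idea concrete, the following is a minimal, illustrative sketch (not the authors' Time Machine implementation): one causally masked self-attention stack produces a distribution over the next log event, and a second stack produces a non-negative lead-time estimate. All dimensions, weight names (`Wq`, `w_time`, etc.), and the tiny single-head layers are hypothetical simplifications for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    # x: (T, d) sequence of event embeddings
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # Causal mask: an event may only attend to itself and earlier events.
    mask = np.triu(np.ones((len(x), len(x)), dtype=bool), k=1)
    scores[mask] = -1e9
    return softmax(scores) @ v

def decoder_stack(x, params):
    # A stack of simplified decoder layers: attention + ReLU MLP, residual connections.
    for Wq, Wk, Wv, W1, W2 in params:
        x = x + causal_self_attention(x, Wq, Wk, Wv)
        x = x + np.maximum(x @ W1, 0) @ W2
    return x

rng = np.random.default_rng(0)
vocab, d = 16, 8  # hypothetical event-vocabulary size and embedding width

def make_params(n_layers=2):
    return [tuple(rng.normal(scale=0.1, size=(d, d)) for _ in range(5))
            for _ in range(n_layers)]

emb = rng.normal(size=(vocab, d))
seq = [3, 7, 2, 9, 1]          # a toy sequence of log-event IDs
x = emb[seq]

# Stack 1: generative head over the next log event (tied output embedding).
h1 = decoder_stack(x, make_params())
event_probs = softmax(h1[-1] @ emb.T)

# Stack 2: regression head for the lead time to the predicted failure.
w_time = rng.normal(scale=0.1, size=(d,))
h2 = decoder_stack(x, make_params())
lead_time = float(np.maximum(h2[-1] @ w_time, 0.0))

print(event_probs.shape, round(float(event_probs.sum()), 6), lead_time >= 0)
```

In a real system the two stacks would be trained self-supervisedly on historical logs (next-event prediction and time-to-failure targets); the sketch only shows how the causal masking keeps both predictions usable in real time, since each position depends only on past events.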