2023
DOI: 10.3390/math11244972
A Bidirectional Long Short-Term Memory Autoencoder Transformer for Remaining Useful Life Estimation

Zhengyang Fan,
Wanru Li,
Kuo-Chu Chang

Abstract: Estimating the remaining useful life (RUL) of aircraft engines plays a pivotal role in enhancing safety, optimizing operations, and promoting sustainability, making it a crucial component of modern aviation management. Precise RUL predictions offer valuable insights into an engine’s condition, enabling informed decisions regarding maintenance and crew scheduling. In this context, we propose a novel RUL prediction approach in this paper, harnessing the power of bi-directional LSTM and Transformer architectures…


Cited by 6 publications (5 citation statements)
References 58 publications
“…Table 5 shows the comparison results. [56], GCT [36], DCNN [57], ELSTMNN [58], DATCN [59], AGCNN [60], BiLSTM attention model [61], DAST [46], DLformer [37], 1D-CNN-LSTM [62], CNN-LSTM-SAM [63], and BiLSTM-DAE-Transformer [42]. As presented in Table 5, the proposed STAR framework consistently outperforms existing RUL prediction models across all the datasets, showcasing its superior predictive capabilities.…”
Section: RUL Prediction
confidence: 80%
“…Rooted in the Transformer architecture and incorporating a gated mechanism, the BGT model effectively quantifies both epistemic and aleatory uncertainties and provides risk-aware RUL predictions. Most recently, Fan et al [42] introduced the BiLSTM-DAE-Transformer framework for RUL prediction, utilizing the Transformer's encoder as the framework's backbone and integrating it with a self-supervised denoising autoencoder that employs BiLSTM for enhanced feature extraction.…”
Section: Related Literature
confidence: 99%
“…The experimental study confirmed that the proposed model outperforms the CNN, LSTM, and Auto-encoder models. In [18], the raw data are first denoised using a Bi-LSTM autoencoder before being passed into the Transformer encoder for RUL prediction. The denoising step's effectiveness was confirmed through various experiments.…”
Section: Related Work and Contributions
confidence: 99%
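The denoise-then-predict pipeline described in the quote above can be sketched in PyTorch. This is an illustrative reconstruction, not the authors' implementation: the layer sizes, the 14-sensor input (C-MAPSS-like), the 30-cycle window, and the class names are all assumptions for demonstration.

```python
import torch
import torch.nn as nn

class BiLSTMDenoiser(nn.Module):
    """Self-supervised denoising autoencoder: a BiLSTM trained to
    reconstruct clean sensor sequences from noisy input windows."""
    def __init__(self, n_features=14, hidden=32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden,
                               batch_first=True, bidirectional=True)
        self.decoder = nn.Linear(2 * hidden, n_features)

    def forward(self, x):                 # x: (batch, time, features)
        h, _ = self.encoder(x)            # (batch, time, 2*hidden)
        return self.decoder(h)            # reconstructed sequence

class RULTransformer(nn.Module):
    """Transformer-encoder backbone with a regression head for RUL."""
    def __init__(self, n_features=14, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                 # x: (batch, time, features)
        z = self.encoder(self.proj(x))    # (batch, time, d_model)
        return self.head(z[:, -1])        # RUL from the last time step

# Denoised windows are passed to the Transformer encoder for prediction.
denoiser, predictor = BiLSTMDenoiser(), RULTransformer()
noisy = torch.randn(8, 30, 14)            # 8 windows, 30 cycles, 14 sensors
rul = predictor(denoiser(noisy))          # shape: (8, 1)
```

In the cited framework the denoiser is pretrained self-supervised (reconstructing clean from corrupted input) before the predictor is trained on its output; here the two stages are simply chained to show the data flow.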
“…In the literature, several DL models have been proposed to estimate the RUL of rotating machines. These models often consist of simple neural network architectures, including a series of several Long Short-Term Memory (LSTM) or convolutional layers [3,17,18]. The convolutional layers are directed towards the identification of spatial dependencies within time series data, while the LSTM layers excel at identifying and capturing long-term correlations.…”
Section: Introduction
confidence: 99%
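The division of labor the quote describes, convolutional layers for local spatial patterns across sensor channels and LSTM layers for long-term temporal correlations, can be sketched as a minimal hybrid model. Names and hyperparameters below are illustrative assumptions, not taken from any cited paper.

```python
import torch
import torch.nn as nn

class ConvLSTMRUL(nn.Module):
    """Conv1d extracts local patterns across sensor channels;
    an LSTM then models long-term degradation trends."""
    def __init__(self, n_features=14, n_filters=32, hidden=50):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, n_filters, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(n_filters, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, time, features)
        c = self.conv(x.transpose(1, 2))   # Conv1d expects (batch, channels, time)
        h, _ = self.lstm(c.transpose(1, 2))
        return self.head(h[:, -1])         # regress RUL from final state

model = ConvLSTMRUL()
out = model(torch.randn(4, 30, 14))        # shape: (4, 1)
```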