Published: 2021 | DOI: 10.1007/s40747-021-00606-4
Machine remaining life prediction based on multi-layer self-attention and temporal convolution network

Abstract: Convolution neural networks (CNNs) have been widely used in the field of remaining useful life (RUL) prediction. However, CNN-based RUL prediction methods have some limitations. The receptive field of a CNN is limited, and the gradient vanishing problem easily occurs when the network is too deep. The differing contributions of different channels and different time steps to RUL prediction are not considered, and only deep learning features or handcrafted statistical features are used for prediction. These limitations ca…

Cited by 17 publications (10 citation statements) | References 32 publications
“…Self-attention mechanism [31][32][33] is an important improvement on traditional attention mechanism and plays a key role in neural networks. It aims to capture the internal correlations of data and can help the model focus more on informative information that makes a significant contribution to the output.…”
Section: SM Unit (mentioning)
Confidence: 99%
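The excerpt above describes self-attention as capturing the internal correlations of the data. A minimal NumPy sketch of scaled dot-product self-attention (single head, no learned query/key/value projections; the function name and shapes are illustrative, not from the paper) shows the core idea: every time step computes similarity scores against all other steps and returns a softmax-weighted mixture of them.

```python
import numpy as np

def self_attention(x):
    """Minimal scaled dot-product self-attention (single head, no learned
    projections): each time step attends to every time step, so each output
    row is a similarity-weighted mixture of the whole sequence."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                   # (T, T) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over time steps
    return weights @ x                              # context-aware representation

T, d = 5, 4
x = np.random.default_rng(0).standard_normal((T, d))
out = self_attention(x)
print(out.shape)  # (5, 4): same shape as the input, each step now sees all steps
```

A full implementation would add learned projections for queries, keys, and values, but the similarity-then-weighted-sum structure is what lets the model emphasize the informative steps mentioned in the excerpt.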
“…To validate the superiority of the proposed method, the performance of the NT-TCN model is compared to other state-of-the-art models. In this part, some classic and latest methods including CNN [3], LSTM [9], DCNN [5], BiLSTM [15], MS-DCNN [7], ALSTM [18], AGCNN [1], Cap-LSTM [17], BiGRU-MMoE [13], MLSA-TCN [24] and our proposed NT-TCN are considered for comparison. Table 6 shows the results.…”
Section: Comparisons With Other Approaches (mentioning)
Confidence: 99%
“…It performs better than standard LSTM and GRU on multiple tasks. Considering the contribution differences between different channels and different time steps, Shang et al [24] proposed an RUL prediction method based on multi-layer self-attention and TCN (MLSA-TCN). Although TCN shows great advantages in residual life prediction, it has not attracted enough attention.…”
Section: Introduction (mentioning)
Confidence: 99%
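The excerpt above contrasts TCN-based prediction with recurrent models. The building block that gives a TCN its long receptive field is the causal dilated 1-D convolution; a small NumPy sketch (illustrative only; the function name and weights are assumptions, not the paper's implementation) makes the causality explicit: the output at time t depends only on inputs at t, t - dilation, t - 2*dilation, and so on, never on the future.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """One causal dilated 1-D convolution, the core TCN building block.
    The sequence is left-padded with zeros so the output has the same
    length as the input and never looks ahead in time."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # pad the past, never the future
    return np.array([
        sum(w[i] * xp[t + pad - i * dilation] for i in range(k))
        for t in range(len(x))
    ])

x = np.arange(6, dtype=float)  # [0, 1, 2, 3, 4, 5]
out = causal_dilated_conv1d(x, w=np.array([1.0, 1.0]), dilation=2)
print(out)  # each y[t] = x[t] + x[t-2], with x[t-2] = 0 for t < 2
```

Stacking such layers with exponentially growing dilations (1, 2, 4, ...) is what lets a TCN cover long histories with few layers, which is the advantage the excerpt alludes to.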
“…Self-attention models can effectively handle variablelength time series and establish long-range dependencies [32].…”
Section: International Journal of Aerospace Engineering (mentioning)
Confidence: 99%