Remaining Useful Life Prediction for Circuit Breaker Based on Opening-Related Vibration Signal and SA-CNN-GRU
2022 | DOI: 10.1109/jsen.2022.3214608

Cited by 16 publications (3 citation statements)
References 24 publications

“…This could include using a network (e.g., LSTM) to preprocess the data for subsequent processes or networks (e.g., Wiener process) [94], or using different networks in parallel. For example, a CNN-GRU [224], CNN-LSTM [121], TCN-LSTM [225], or LSTM-PF [188] architecture can combine the feature extraction capabilities of the CNN and the temporal modeling capabilities of the GRU or LSTM to create a superior deep learning architecture [98,126]. Since CNNs are designed to handle local, short-term dependencies, they can benefit from other methods that deal with long-term dependencies [50].…”
Section: Collaborative Architectures
confidence: 99%
“…This could include using a network (e.g., LSTM) to preprocess the data for subsequent processes or networks (e.g., Wiener process) [109], or using different networks in parallel. For example, a CNN-GRU [243], CNN-LSTM [136], TCN-LSTM [244], or LSTM-PF [204] architecture can combine the feature extraction capabilities of the CNN and the temporal modeling capabilities of the GRU or LSTM to create a superior deep learning architecture [113,141]. Since CNNs are designed to handle local, short-term dependencies, they can benefit from other methods that deal with long-term dependencies [50].…”
Section: Collaborative Architectures
confidence: 99%
“…This could include using a network (e.g., LSTM) to pre-process the data for subsequent processes or networks (e.g., Wiener process) [109], or using different networks in parallel. For example, a CNN-GRU [243], CNN-LSTM [136], TCN-LSTM [244], or LSTM-PF [204] architecture can combine the feature extraction capabilities of the CNN and the temporal modeling capabilities of the GRU or LSTM to create a superior deep learning architecture [113], [141]. Since CNNs are designed to handle local, short-term dependencies, they can benefit from other methods that deal with long-term dependencies [50].…”
Section: Collaborative Architectures
confidence: 99%
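
The CNN-GRU pairing these statements describe is straightforward to express in code. Below is a minimal PyTorch sketch of the general idea, not the SA-CNN-GRU from the cited paper (which adds self-attention, omitted here): the channel counts, kernel sizes, hidden size, and window length are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CNNGRU(nn.Module):
    """Minimal CNN-GRU sketch: 1-D convolutions extract local,
    short-term features from a vibration-signal window; a GRU models
    the longer-range temporal dependencies; a linear head regresses
    a scalar RUL estimate."""

    def __init__(self, in_channels=1, conv_channels=32, gru_hidden=64):
        super().__init__()
        # CNN stage: local, short-term feature extraction.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # GRU stage: temporal modeling over the CNN feature sequence.
        self.gru = nn.GRU(conv_channels, gru_hidden, batch_first=True)
        # Regression head: last hidden state -> scalar RUL value.
        self.head = nn.Linear(gru_hidden, 1)

    def forward(self, x):
        # x: (batch, channels, time), one vibration window per sample.
        feats = self.cnn(x)                # (batch, conv_channels, time/4)
        feats = feats.transpose(1, 2)      # (batch, time/4, conv_channels)
        _, h_n = self.gru(feats)           # h_n: (1, batch, gru_hidden)
        return self.head(h_n[-1]).squeeze(-1)  # (batch,) RUL estimates

# Usage sketch: a batch of 8 single-channel windows of 1024 samples.
model = CNNGRU()
rul = model(torch.randn(8, 1, 1024))
print(rul.shape)  # torch.Size([8])
```

The division of labor matches the statements above: the convolutional stage handles local dependencies in the raw signal, while the recurrent stage supplies the long-term temporal modeling that CNNs alone lack.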