2023
DOI: 10.3390/electronics12204282
A Sparse Learning Method with Regularization Parameter as a Self-Adaptation Strategy for Rolling Bearing Fault Diagnosis

Yijie Niu,
Wu Deng,
Xuesong Zhang
et al.

Abstract: Sparsity-based fault diagnosis methods have achieved great success. However, fault classification is still challenging because of neglected potential knowledge. This paper proposes a combined sparse representation deep learning (SR-DEEP) method for rolling bearing fault diagnosis. Firstly, the SR-DEEP method utilizes prior domain knowledge to establish a sparsity-based fault model. Then, based on this model, the corresponding regularization parameter regression networks are trained for different running states…
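The abstract stops short of algorithmic detail, but its core idea, a sparse reconstruction whose regularization parameter is supplied by a learned regressor rather than hand-tuned, can be sketched. The sketch below is not the paper's implementation: it assumes an ℓ1-regularized model solved by ISTA, and `predict_lambda` is a hypothetical stand-in for the trained regularization-parameter regression networks.

```python
import numpy as np

def ista(D, y, lam, n_iter=500):
    """ISTA for the lasso: min_x 0.5*||y - D x||^2 + lam*||x||_1."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the smooth part
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x - (D.T @ (D @ x - y)) / L      # gradient step on the data-fit term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-thresholding
    return x

def predict_lambda(signal):
    """Hypothetical stand-in for the paper's regularization-parameter
    regression network: here it returns a fixed value, whereas SR-DEEP
    would map the running state to a suitable lambda."""
    return 0.1

rng = np.random.default_rng(0)
D = rng.standard_normal((60, 100))
D /= np.linalg.norm(D, axis=0)               # unit-norm dictionary atoms
x_true = np.zeros(100)
x_true[[3, 40, 77]] = [1.5, -2.0, 1.0]       # sparse "fault signature"
y = D @ x_true + 0.01 * rng.standard_normal(60)
x_hat = ista(D, y, predict_lambda(y))        # recovers the sparse code
```

Swapping the fixed `predict_lambda` for a network trained per running state is what would make the regularization parameter self-adaptive in the sense the title describes.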

Cited by 2 publications (2 citation statements) | References 30 publications
“…
Ref | Method | Fault categories | Experiment samples | Diagnosis accuracy (%)
[227] | FKP-SGECNN | 10 | - | 99.63
[228] | Improved graph convolutional network (GCN) | 28 | 1400 | 99.12
[229] | BCMFDE-RF-mRMR-KNN | 10 | 550 | 99.09
[230] | NWMF-CNN | 10 | 4000 | 99.80
[231] | Reinforcement neural architecture search CNN | 12 | 3600 | 99.65
[232] | Dimension expansion and AntisymNet lightweight CNN | 10 | 10 000 | 99.70
[233] | Multi-scale weighted graph-MCGCN | 10 | 3000 | 99.45
[234] | Improved (ICEEMDAN)-ICA-FuEn | 10 | 10 000 | 99.91
[235] | MAM-DSDCNN | 7 | 2800 | 99.63
[236] | Sparse representation deep learning (SR-DEEP) | 4 | 2000 | 100.00
[237] | Online sequential extreme learning machine (OS-ELM) | 4 | 9466 | 99.62
[238] | IFE + CBAM-enhanced InceptionNet | 10 | 1000 | 99.5
[239] | SSCL method based on MSA mechanism and MCL | 10 | 2000 | 99.97
[240] | MRDNN-AG | 10 | 120 000 | 98.85
[241] | AMCEEMD-1DCNN | 7 | 3500 | 99.50
[242] | Modified AlexNet-SVM | 4 | - | 99.60
[243] | FC-CLDCNN | 10 | 10 000 | 99.95
[244] | PCA-ICEEMDAN and BiLSTM-SCN-CCAM | 10 | 1024 | 99.92
[245] | 2ADA + MK-MMD | 10 | 1960 | 99.76
[246] | 1D feature matching domain adaptation | 3 | 9000 | 100.00
[247] | ICEEMDAN-Hilbert transform-CBAM | 10 | 30 000 | 95.2
[248] | Ensemble MSRCNN-BiLSTM | 4 | 4800 | 98.43
[249] | WKN-BiLSTM-AM | 10 | 1750 | 99.7
[250] | MVO-MOMEDA-SVM | 4 | 400 | 92.50
[251] | WPDPCC-DGCL | 10 | 6000 | 98.65
[252] | I-PixelHop framework based on Spark-GPU | 10 | - | 98.93
…”
Section: Reference Methods Type
Citation type: mentioning
Confidence: 99%
“…(1) SR-DEEP [32], (2) BFD-2DCNN [33], (3) LSSA-VMD-GRU [34], (4) WPD-CSSOA-DBN [35], (5) ACPSO-BP [36]. The experimental results are shown in Fig.…”
Section: Comparison
Citation type: mentioning
Confidence: 99%