2024
DOI: 10.1109/tnnls.2023.3282599
WPConvNet: An Interpretable Wavelet Packet Kernel-Constrained Convolutional Network for Noise-Robust Fault Diagnosis

Cited by 14 publications (3 citation statements) | References 45 publications
“…One sample from each class in both gearbox datasets is selected for visualization. The global features consist of 128 × 1 channels, each representing a feature in the corresponding frequency band of a wavelet basis [40]. First, the obtained 128-channel feature mapping is reordered according to its corresponding frequency (from low to high) and normalized.…”
Section: Experimental Results and Analysis
confidence: 99%
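The reordering step quoted above corresponds to mapping the wavelet packet channels from their natural (Paley) order to low-to-high frequency order; for a full decomposition this is the binary-reflected Gray code permutation. A minimal sketch, assuming a NumPy vector holding one value per packet node and min-max normalization (the function name, array shape, and normalization choice are illustrative assumptions, not the cited paper's exact procedure):

```python
import numpy as np

def reorder_and_normalize(features: np.ndarray) -> np.ndarray:
    """Map a wavelet packet feature vector from natural (Paley) channel order
    to low-to-high frequency order, then min-max normalize it.

    Assumes one value per node of a full decomposition, so the length must
    be a power of two (128 here) and channels are stored in natural order.
    """
    n = features.shape[0]
    # Binary-reflected Gray code: the k-th frequency band (low to high)
    # is occupied by the node with natural index k ^ (k >> 1).
    perm = np.array([k ^ (k >> 1) for k in range(n)])
    ordered = features[perm]
    # Min-max scaling to [0, 1] (an illustrative choice of normalization).
    return (ordered - ordered.min()) / (ordered.max() - ordered.min() + 1e-12)

# Example: a 128-channel global feature map, one scalar per channel.
feat = np.abs(np.random.randn(128))
feat_freq = reorder_and_normalize(feat)
```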
“…The convolution layer and activation function play a significant role in feature extraction within deep neural networks [33]. Previous studies have drawn parallels between convolutional operations and wavelet decomposition, substituting the first convolutional layer with a wavelet convolutional layer to capitalize on the unique filtering capabilities of wavelet decomposition.…”
Section: Atraf
confidence: 99%
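The substitution mentioned in this statement can be pictured as a first convolutional layer whose kernels are fixed (or constrained) to a discrete wavelet filter pair instead of being learned freely. A minimal PyTorch sketch using the Daubechies-4 decomposition filters from PyWavelets (the function name, wavelet choice, and frozen weights are assumptions for illustration, not the exact kernel constraint used in WPConvNet):

```python
import pywt
import torch
import torch.nn as nn

def wavelet_conv_layer(wavelet_name: str = "db4", trainable: bool = False) -> nn.Conv1d:
    """Build a stride-2 Conv1d whose two output channels apply the wavelet's
    low-pass and high-pass decomposition filters to a 1-channel signal."""
    wavelet = pywt.Wavelet(wavelet_name)
    # Conv1d performs cross-correlation, so reverse the filters to obtain
    # true convolution with the decomposition filter pair.
    dec_lo = torch.tensor(wavelet.dec_lo[::-1], dtype=torch.float32)
    dec_hi = torch.tensor(wavelet.dec_hi[::-1], dtype=torch.float32)
    conv = nn.Conv1d(in_channels=1, out_channels=2,
                     kernel_size=len(wavelet.dec_lo), stride=2, bias=False)
    with torch.no_grad():
        conv.weight.copy_(torch.stack([dec_lo, dec_hi]).unsqueeze(1))
    conv.weight.requires_grad_(trainable)
    return conv

# Example: one decomposition level applied to a batch of raw vibration signals.
x = torch.randn(8, 1, 1024)              # (batch, channel, length)
subbands = wavelet_conv_layer()(x)       # (8, 2, ~509): low/high-pass outputs
```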
“…To overcome the above drawbacks, the interpretability of CNNs is gradually attracting attention in the field of Interpretable Deep Learning (IDL) [15,16]. Currently, interpretable methods can be categorized as pre-methods and post-methods.…”
Section: Introduction
confidence: 99%