2022
DOI: 10.3390/machines10100879
Lightweight Network with Variable Asymmetric Rebalancing Strategy for Small and Imbalanced Fault Diagnosis

Abstract: Deep learning-related technologies have achieved remarkable success in the field of intelligent fault diagnosis. Nevertheless, traditional intelligent diagnosis methods often presuppose sufficient annotated signals and a balanced class distribution, and their model structures are so complex that they demand substantial computational resources. To this end, a lightweight class-imbalanced diagnosis framework based on a depthwise separable Laplace-wavelet convolution network with variable-asymmetric…

Cited by 5 publications (5 citation statements) · References 49 publications
“…Furthermore, during dataset organization, adjusting the number of samples in each category to the desired quantity proves highly challenging. Similarly, in the context of imbalanced classes, when N_fault/N_normal ≤ 0.1 the problem is referred to as extremely imbalanced [35], where N_fault is the number of fault samples and N_normal the number of normal samples. To thoroughly assess the proposed model's performance on small and imbalanced datasets, four datasets (A, B, C, and D) were created, with details provided in Table 3.…”
Section: Accuracy
confidence: 99%
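The imbalance criterion quoted above can be sketched as a short Python check; `subsample_faults` is a hypothetical helper (not from the cited paper) showing how such imbalanced splits might be constructed:

```python
def is_extremely_imbalanced(n_fault, n_normal):
    """Extreme-imbalance criterion from the cited work: N_fault / N_normal <= 0.1."""
    return n_fault / n_normal <= 0.1

def subsample_faults(fault_samples, normal_samples, ratio):
    """Build an imbalanced split by keeping only enough fault samples
    to reach the requested N_fault / N_normal ratio (hypothetical helper)."""
    n_fault = max(1, int(round(ratio * len(normal_samples))))
    return fault_samples[:n_fault], normal_samples
```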
“…This further demonstrates that the proposed method can balance global and local fault time-frequency bands, providing excellent modeling and interference-resistance capabilities. To validate the superiority of the intelligent fault diagnosis method proposed in this paper, we performed comparative experiments with other recent models, such as EWSNet [39], DCA-BiGRU [34], and DSLWCN-VAFL [35]. For a fair comparison, the hyperparameters of these models were set to match those used in this paper.…”
Section: Ablation Experiments
confidence: 99%
“…The experimental results show that this method can diagnose bearing faults from small samples and has strong universality and robustness. Chen [15] combined wavelets with depthwise separable convolutional neural networks to design a few-parameter branch for time-frequency feature extraction. This branch captured fault features from a limited number of samples and, together with regular convolution, realized fault diagnosis under small-sample conditions.…”
Section: Research Status of Small-Sample Issues in Fault Diagnosis
confidence: 99%
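To illustrate why a depthwise separable convolution yields a few-parameter branch, a rough weight count (bias terms omitted; a sketch for intuition, not the cited architecture) can be compared against a standard 1-D convolution:

```python
def standard_conv1d_params(c_in, c_out, k):
    """Weights in a standard 1-D convolution: one length-k filter
    per (input channel, output channel) pair."""
    return c_in * c_out * k

def depthwise_separable_conv1d_params(c_in, c_out, k):
    """Depthwise step: one length-k filter per input channel.
    Pointwise step: a 1x1 convolution mixing channels."""
    return c_in * k + c_in * c_out
```

For example, with 64 input channels, 128 output channels, and kernel size 16, the separable form uses 9,216 weights versus 131,072 for the standard form, roughly a 14x reduction.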
“…Chen et al proposed using two separate hyperparameters γ_pos and γ_neg within the focal loss for fault detection, intrinsically giving the loss function greater tunability [8]. Note that one potential reason the AFL is adept at handling class-imbalanced data is that not only does the focal loss component resolve difficult misclassifications, but the FL term (for y = 0 here) is also inherently under-weighted versus the BCE component (for y = 1).…”
Section: Cost-Sensitive Loss Functions, 2.3.1 Class Imbalance and the As…
confidence: 99%
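A minimal sketch of such an asymmetric focal loss for a single binary prediction, assuming the usual focal-loss form with separate focusing exponents for the two classes (function name and default values are illustrative, not taken from the cited papers):

```python
import math

def asymmetric_focal_loss(p, y, gamma_pos=0.0, gamma_neg=2.0, eps=1e-7):
    """Binary focal loss with separate focusing parameters for the
    positive (y = 1) and negative (y = 0) classes.

    p : predicted probability of the positive class
    y : true label, 0 or 1
    """
    p = min(max(p, eps), 1.0 - eps)  # clamp to avoid log(0)
    if y == 1:
        # positive term: (1 - p)^gamma_pos * -log(p)
        return -((1.0 - p) ** gamma_pos) * math.log(p)
    # negative term: p^gamma_neg * -log(1 - p)
    return -(p ** gamma_neg) * math.log(1.0 - p)
```

With `gamma_pos = gamma_neg = 0` this reduces to plain binary cross-entropy; raising `gamma_neg` alone down-weights easy negatives while leaving the positive (minority-fault) term untouched, which matches the asymmetry described above.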
“…One key contribution of this work is a mechanism for minimizing the total economic misclassification cost R as a function of the misclassification cost ratio C and classification threshold τ in the absence of quantitative industrial misclassification costs. Another key contribution is the application of a scaling term to the asymmetric focal loss (AFL) function [5-8], yielding a scaled AFL (sAFL) with average magnitudes comparable to the binary cross-entropy (BCE). This scaling removes the intrinsic average-weighting effects of the AFL, permitting a clearer understanding of how differences in functional curvature within the loss function can contribute to optimizing total cost.…”
Section: Introduction
confidence: 99%
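One plausible way to realize such a scaling term, sketched here as a batch-level factor that matches the mean AFL magnitude to the mean BCE (an assumption about the mechanism, not the paper's exact formulation):

```python
import math

def bce(p, y, eps=1e-7):
    """Binary cross-entropy for one prediction."""
    p = min(max(p, eps), 1.0 - eps)
    return -math.log(p) if y == 1 else -math.log(1.0 - p)

def afl(p, y, gamma_neg=2.0, eps=1e-7):
    """Asymmetric focal loss with gamma_pos = 0 (positives keep the BCE term)."""
    p = min(max(p, eps), 1.0 - eps)
    if y == 1:
        return -math.log(p)                       # gamma_pos = 0: plain BCE term
    return -(p ** gamma_neg) * math.log(1.0 - p)  # focal down-weighting on negatives

def scaled_afl_factor(probs, labels, gamma_neg=2.0):
    """Scale factor that makes the mean AFL over a batch equal the mean BCE,
    removing the focal term's intrinsic average down-weighting."""
    mean_bce = sum(bce(p, y) for p, y in zip(probs, labels)) / len(probs)
    mean_afl = sum(afl(p, y, gamma_neg) for p, y in zip(probs, labels)) / len(probs)
    return mean_bce / mean_afl
```

Multiplying each AFL term by this factor leaves the relative per-sample weighting (the curvature differences discussed above) intact while restoring a BCE-comparable average magnitude.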