2021
DOI: 10.1007/s12206-021-0709-7
Reliability improvement in the presence of weak fault features using non-Gaussian IMF selection and AdaBoost technique

Abstract: In machinery fault detection and identification (FDI), decomposing vibration signals into their corresponding intrinsic mode functions (IMFs) reduces the intricacy of extracting weak fault features in the early failure state. However, selecting a suitable IMF for fault-information extraction is a challenging task. Because vibration signals are random in nature, analyzing the non-Gaussian IMFs extracts effective fault-related information more readily than analyzing the entire signal or the other IMFs. In this study, we prese…
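
The pipeline the abstract describes, decomposing the vibration signal into IMFs and keeping only the non-Gaussian ones, can be sketched as follows. This is a minimal illustration assuming empirical mode decomposition via the PyEMD package and excess kurtosis as the non-Gaussianity measure; the paper's actual selection criterion is not visible in the truncated abstract.

```python
# Minimal sketch: EMD decomposition + non-Gaussian IMF selection.
# Assumptions (not from the paper): PyEMD for decomposition, excess
# kurtosis as the non-Gaussianity score, and a threshold of 0.5.
import numpy as np
from scipy.stats import kurtosis
from PyEMD import EMD  # pip install EMD-signal

def select_non_gaussian_imfs(signal, threshold=0.5):
    """Decompose a 1-D signal and keep IMFs whose excess kurtosis
    deviates from 0 (the value for a Gaussian distribution)."""
    imfs = EMD().emd(signal)                 # shape: (n_imfs, len(signal))
    scores = np.abs(kurtosis(imfs, axis=1))  # excess kurtosis per IMF
    return imfs[scores > threshold], scores

# Example: a noisy sinusoid with weak impulsive "fault" spikes
t = np.linspace(0, 1, 2048)
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
x[::256] += 3.0                              # weak impulsive component
selected, scores = select_non_gaussian_imfs(x)
print(f"kept {len(selected)} of {len(scores)} IMFs, scores={np.round(scores, 2)}")
```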

Cited by 5 publications (4 citation statements) · References 35 publications
“…It is an implementation of the boosting algorithm and an efficient, accurate classification algorithm. Since its invention, it has been widely used in various fields [25]. The working mechanism of the AdaBoost algorithm is as follows: (a) the training set and a weak learning algorithm are determined, and an initial weight is assigned to each training sample.…”
Section: Design and Improvement of Feature Fuser
confidence: 99%
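
The quoted step (a) is the initialization of AdaBoost's sample weights. A compact from-scratch sketch of the full mechanism follows, using decision stumps from scikit-learn as the weak learner (my choice of weak learner, not something the citing paper specifies):

```python
# AdaBoost mechanism: uniform initial sample weights, then
# re-weighting after each weak learner so that misclassified
# samples receive more attention in the next round.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """y must be in {-1, +1}. Returns (learners, alphas)."""
    n = len(y)
    w = np.full(n, 1.0 / n)  # step (a): uniform initial weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # vote weight of this learner
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
        w /= w.sum()                           # renormalize the distribution
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    votes = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(votes)
```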
“…AdaBoost's key disadvantage is that it requires a high-quality dataset; noisy data and outliers should be avoided before applying it [53,54]. In a supervised setting, where we are provided with a dataset with target labels, k nearest neighbours (KNN) can be used for classification.…”
Section: Overview of Machine Learning-Based Algorithms
confidence: 99%
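
For the KNN classification the quote mentions, a minimal supervised example; the synthetic dataset and k=5 here are illustrative assumptions, not details from the citing paper:

```python
# Minimal KNN classification on a synthetic labelled dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)  # k=5 chosen for illustration
knn.fit(X_tr, y_tr)
print(f"test accuracy: {knn.score(X_te, y_te):.3f}")
```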
“…The false positive (FP) value is the proportion of negative inputs wrongly assigned to the positive class, while the false negative (FN) value is the proportion of positive inputs improperly labelled as negative. Since it is the best criterion for selecting an appropriate model, the model with the lowest (FN) is the best choice [46–50].…”
Section: Supervised Learning Algorithm Evaluation Metrics
confidence: 99%
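
A small worked example of the FP and FN proportions the quote refers to, computed from a binary confusion matrix; the labels are fabricated purely for illustration:

```python
# Compute FP and FN rates from a binary confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 0])  # illustrative labels
y_pred = np.array([1, 1, 0, 1, 0, 1, 0, 0, 0, 0])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
fp_rate = fp / (fp + tn)  # proportion of negatives wrongly flagged
fn_rate = fn / (fn + tp)  # proportion of positives missed
print(f"FP rate = {fp_rate:.2f}, FN rate = {fn_rate:.2f}")
# Per the quoted criterion, prefer the candidate model with the lowest FN rate.
```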