2015
DOI: 10.3390/s150716225
Bearing Fault Diagnosis Based on Statistical Locally Linear Embedding

Abstract: Fault diagnosis is essentially a kind of pattern recognition. The measured signal samples are usually distributed on nonlinear low-dimensional manifolds embedded in the high-dimensional signal space, so implementing feature extraction and dimensionality reduction to improve recognition performance is a crucial task. In this paper, a novel machinery fault diagnosis approach based on a statistical locally linear embedding (S-LLE) algorithm, which is an extension of LLE by exploiting the fault class label information, …
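The pipeline outlined in the abstract (manifold learning for dimensionality reduction, then classification) can be illustrated with a minimal, hypothetical sketch. Plain unsupervised LLE from scikit-learn stands in for S-LLE, whose supervised, label-aware neighbor selection is not available off the shelf; the feature matrix, labels, and parameter values below are placeholders, not the paper's data.

import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder data: 200 samples x 32 high-dimensional signal features, 4 fault classes.
X = rng.normal(size=(200, 32))
y = rng.integers(0, 4, size=200)

# Map the high-dimensional samples onto a low-dimensional manifold (plain LLE;
# S-LLE would additionally use the class labels when choosing neighbors).
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=3)
Z = lle.fit_transform(X)

# Classify in the embedded space, here with an RBF-kernel SVM.
Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(Z_tr, y_tr)
print("test accuracy:", clf.score(Z_te, y_te))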

Cited by 128 publications (56 citation statements)
References 37 publications
“…To testify the superiority of our method, five common dimensionality reduction tools combined with sparse filtering are adopted to process the gearbox dataset respectively. The five tools are: principal component analysis (PCA) [25], locality preserving projection (LPP) [26], Sammon mapping (SM) [27], linear discriminant analysis (LDA) [28], and stochastic proximity embedding (SPE) [29]. The classification results by the five methods are shown in Fig.…”
Section: Diagnosis Results
confidence: 99%
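A rough idea of how such a dimensionality-reduction comparison can be scripted is sketched below. Only PCA and LDA ship with scikit-learn; LPP, Sammon mapping, and SPE require third-party implementations and are omitted, and the gearbox features and fault labels are random placeholders rather than the cited paper's data.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 40))    # placeholder gearbox feature matrix
y = rng.integers(0, 3, size=300)  # placeholder fault labels (3 classes)

reducers = {
    "PCA": PCA(n_components=2),
    "LDA": LinearDiscriminantAnalysis(n_components=2),  # supervised, uses y
}
for name, reducer in reducers.items():
    Z = reducer.fit_transform(X, y)  # PCA ignores y, LDA requires it
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), Z, y, cv=5).mean()
    print(f"{name}: mean 5-fold accuracy = {acc:.3f}")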
“…Table 10 shows classification accuracies of different methods, including Genetic Algorithm with Random Forest [22], Chi Square Features with different classifiers [23], Continuous Wavelet Transform with SVM, Discrete Wavelet Transform with ANN [24], Statistical Locally Linear Embedding with SVM [8] and the method proposed in this paper.…”
Section: Discussion
confidence: 99%
“…In this field, time domain feature methods [1,2,3], including Kernel Density Estimation (KDE), Root Mean Square (RMS), Crest factor, Crest-Crest Value and Kurtosis, frequency domain features [4] such as the frequency spectrum generated by Fourier transformation, time-frequency features obtained by Wavelet Packet Transform (WPT) [5] are usually extracted as the gauge of the next process. Other signal processing methods such as Empirical Mode Decomposition (EMD) [6], Intrinsic Mode Function (IMF), Discrete Wavelet Transform (DWT), Hilbert Huang Transform (HHT) [7,8], Wavelet Transform (WT) [9] and Principal Component analysis (PCA) [10] are also implemented for signal processing. These signal processing and feature extraction methods are followed by some classification algorithms including Support Vector Machine (SVM), Artificial Neural Networks (ANN), Wavelet Neural Networks (WNN) [11], dynamic neural networks and fuzzy inference.…”
Section: Introduction
confidence: 99%
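The time- and frequency-domain quantities named in this passage are straightforward to compute; the sketch below uses a synthetic signal and an assumed sampling rate purely for illustration.

import numpy as np
from scipy.stats import kurtosis

fs = 12_000                                  # assumed sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 157 * t) + 0.3 * np.random.default_rng(2).normal(size=t.size)

rms = np.sqrt(np.mean(x ** 2))               # Root Mean Square
crest = np.max(np.abs(x)) / rms              # Crest factor
p2p = np.max(x) - np.min(x)                  # Crest-crest (peak-to-peak) value
kurt = kurtosis(x, fisher=False)             # Kurtosis

# Frequency-domain feature: amplitude spectrum from the Fourier transform.
spectrum = np.abs(np.fft.rfft(x)) / x.size
freqs = np.fft.rfftfreq(x.size, d=1 / fs)

print(f"RMS={rms:.3f}  crest={crest:.3f}  peak-to-peak={p2p:.3f}  kurtosis={kurt:.3f}")
print("dominant frequency [Hz]:", freqs[np.argmax(spectrum)])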
“…Therefore, it's necessary to find a way to extract useful features that represents the intrinsic information of the machine. Common signal processing techniques used to extract the representative features from the raw signal include time-domain statistical analysis [6], wavelet transformation [7], and Fourier spectral analysis [8]. Usually after feature extraction, a feature selection step will be implemented to get rid of useless and insensitive features, and reduce the dimension for the sake of computational efficiency.…”
Section: Introduction
confidence: 99%
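The extract-then-select workflow described in this passage can be sketched as follows; the segment matrix, labels, the particular statistical features, and the univariate ANOVA F-test selector are illustrative assumptions rather than the cited papers' exact choices.

import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(3)
segments = rng.normal(size=(120, 2048))   # placeholder vibration segments
labels = rng.integers(0, 3, size=120)     # placeholder fault labels

def extract_features(seg):
    # Time-domain statistical features for one segment.
    rms = np.sqrt(np.mean(seg ** 2))
    return [rms, np.max(np.abs(seg)) / rms, kurtosis(seg), skew(seg), np.std(seg)]

X = np.array([extract_features(s) for s in segments])

# Feature selection: keep only the features most related to the fault label.
selector = SelectKBest(f_classif, k=3).fit(X, labels)
print("selected feature indices:", selector.get_support(indices=True))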