2018 International Conference on Sensing, Diagnostics, Prognostics, and Control (SDPC)
DOI: 10.1109/sdpc.2018.8664979
Gear Crack Level Classification by Using KNN and Time-Domain Features from Acoustic Emission Signals Under Different Motor Speeds and Loads

Cited by 8 publications (5 citation statements) | References 19 publications
“…KNN mainly targets the nearest neighbors’ labels using two instances with p features, x_i = {x_i1, x_i2, …, x_ip} and x_j = {x_j1, x_j2, …, x_jp}, from the training data set, where i = 1, 2, …, n, j = 1, 2, …, n, and n is the total number of samples. A metric d(x_i, x_j), which can be Manhattan, Euclidean, chi-square, or cosine, among other distance measures, is used to determine the distance between the two instances (Sanchez et al., 2018). To label a new instance, the KNN algorithm locates the k neighbors in the training data set with the smallest distances under the chosen metric; thereafter, KNN selects the most dominant class among those k nearest neighbors.…”
Section: Mathematical Background
confidence: 99%
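The nearest-neighbor rule quoted above can be sketched in a few lines of Python. This is a minimal illustration, not the cited authors' implementation; the function names, the toy training set, and the class labels are invented for the example, and the distance metric is left pluggable to mirror the excerpt's point that Manhattan, Euclidean, chi-square, or cosine distances may be used.

```python
import math
from collections import Counter

def euclidean(xi, xj):
    """Euclidean distance between two p-dimensional instances."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def knn_classify(train, new_x, k=3, metric=euclidean):
    """Label new_x by majority vote among its k nearest training instances.

    train: list of (features, label) pairs, features being length-p tuples.
    """
    # Rank all training instances by distance to the query and keep the k nearest.
    neighbors = sorted(train, key=lambda pair: metric(pair[0], new_x))[:k]
    # The most dominant class among the k nearest neighbors wins the vote.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy example with two well-separated classes (hypothetical feature values).
train = [((0.0, 0.0), "healthy"), ((0.1, 0.2), "healthy"),
         ((1.0, 1.0), "cracked"), ((0.9, 1.1), "cracked")]
print(knn_classify(train, (0.05, 0.1), k=3))  # "healthy"
```

Swapping `metric` for a Manhattan or cosine distance changes only how neighbors are ranked; the voting step is unchanged.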
“…Liu et al. (2018) have also reviewed artificial intelligence-based methods, including K-nearest neighbors (KNN), support vector machine (SVM), ANN, Naive Bayes, and deep learning, for rotating machine fault diagnosis. KNN has been widely implemented for defect detection and classification in several works (Sanchez et al., 2018; Tayyab et al., 2021a, 2021b). KNN is computed based on the distances between the sample points and the nearest neighbors of the assigned set of points (Kherif et al., 2021).…”
Section: Introduction
confidence: 99%
“…Sanchez et al. 20 also proposed fault diagnosis based on features extracted from vibration and AE signals in gearboxes, 21,22 using the kNN algorithm. The proposed model used ReliefF ranking over the four and five signal features to achieve accuracies of 98.11% and 98%, respectively.…”
Section: KNN Algorithm in Fault Diagnosis
confidence: 99%
“…RF has also been applied for rotating machine defect diagnosis [43] and for air compressor fault diagnosis, again based on vibration analysis [44]. Besides RF, ensemble tree (ET) and K-nearest neighbors (KNN) have also been widely implemented for defect detection and classification in several works [45][46][47][48][49]. Although considerable research has been carried out on fault diagnosis using machine learning techniques, works on reciprocating air compressors using acoustic signals are very limited due to the variety of faults and failures that occur in these electromechanical devices.…”
Section: Introduction
confidence: 99%