2016
DOI: 10.3390/s16122069

Fault Detection Using the Clustering-kNN Rule for Gas Sensor Arrays

Abstract: The k-nearest neighbour (kNN) rule, which naturally handles the possible non-linearity of data, is introduced to solve the fault detection problem of gas sensor arrays. In traditional fault detection methods based on the kNN rule, the detection process of each new test sample involves all samples in the entire training sample set. Therefore, these methods can be computation intensive in monitoring processes with a large volume of variables and training samples and may be impossible for real-time monitoring. To…
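To make the idea in the abstract concrete, the following is a minimal, hypothetical sketch of a clustering-accelerated kNN fault detector: fault-free training samples are partitioned with k-means, and each new sample is compared only against the samples of its nearest cluster instead of the whole training set. The cluster count, neighbour count, and quantile-based threshold are assumptions for illustration, not the authors' exact procedure.

```python
# Sketch of a clustering-accelerated kNN fault detector (illustrative only).
import numpy as np
from sklearn.cluster import KMeans


class ClusteringKNNDetector:
    def __init__(self, n_clusters=5, k=3, quantile=0.99):
        self.n_clusters = n_clusters
        self.k = k
        self.quantile = quantile

    def fit(self, X_normal):
        """X_normal: (n_samples, n_features) array of fault-free training data."""
        self.km_ = KMeans(n_clusters=self.n_clusters, n_init=10,
                          random_state=0).fit(X_normal)
        # Keep the training samples of each cluster separately.
        self.clusters_ = [X_normal[self.km_.labels_ == c]
                          for c in range(self.n_clusters)]
        # Calibrate a detection threshold from the training kNN distances
        # (note: each training sample also matches itself, which slightly
        # lowers its score; acceptable for a sketch).
        scores = np.array([self._knn_distance(x) for x in X_normal])
        self.threshold_ = np.quantile(scores, self.quantile)
        return self

    def _knn_distance(self, x):
        # Search only the cluster whose centre is nearest to x,
        # rather than the entire training set.
        c = int(np.argmin(np.linalg.norm(self.km_.cluster_centers_ - x, axis=1)))
        d = np.linalg.norm(self.clusters_[c] - x, axis=1)
        k = min(self.k, len(d))
        return np.mean(np.sort(d)[:k])  # mean distance to the k nearest samples

    def predict(self, X):
        """Return True for samples whose kNN distance exceeds the threshold."""
        return np.array([self._knn_distance(x) > self.threshold_ for x in X])
```

The key saving over a plain kNN detector is that the neighbour search touches only one cluster per test sample, which is the motivation the abstract gives for the clustering step.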

Cited by 50 publications (45 citation statements). References: 37 publications.
“…Previous studies have adopted several conventional algorithms for classifying sitting postures, such as the Hidden Markov Model (HMM), the Naïve Bayes (NB) classifier, and the k-nearest neighbor (kNN) classifier [8][9][10]. Conventional machine learning algorithms, including neural networks, support vector machines (SVM), and kNN, are still being adopted for various research objectives such as fault diagnosis, wind speed prediction, and thermal anomaly identification [11][12][13][14]. Recently, it has been proven that high performance can be obtained by using deep learning in various research fields such as image processing and speech recognition.…”
Section: Introduction (mentioning; confidence: 99%)
“…In the feature extraction section, the PCA [25] and FFT [26] algorithms are used to extract pyroelectric features. In the recognition section, different recognition results can be obtained through data analysis, including FuzzyK [27], K-means [28], KNN [29], SVM [30], Bayes [31], Fisher [32], and BP [33].…”
Section: Methods (mentioning; confidence: 99%)
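As a rough illustration of the feature-extraction-plus-recognition pipeline this statement describes, the sketch below chains FFT magnitude features, PCA, and a kNN classifier with scikit-learn. The window length, component count, neighbour count, and synthetic data are all assumed for demonstration; this is not the cited study's implementation.

```python
# Hypothetical FFT -> PCA -> kNN recognition pipeline (illustrative only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer


def fft_magnitude(X):
    """X: (n_samples, n_timesteps) raw signals -> one-sided FFT magnitudes."""
    return np.abs(np.fft.rfft(X, axis=1))


pipeline = make_pipeline(
    FunctionTransformer(fft_magnitude),   # spectral features
    PCA(n_components=10),                 # keep the 10 strongest components
    KNeighborsClassifier(n_neighbors=5),  # recognition step
)

# Usage with synthetic data, purely for illustration:
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 256))    # 200 signals, 256 samples each
y_train = rng.integers(0, 2, size=200)   # two classes
pipeline.fit(X_train, y_train)
print(pipeline.predict(X_train[:5]))
```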
“…A k-nearest neighbor (KNN) classifier is a simple classifier that makes a decision about a test input by looking at the k nearest points in the training set [23]. The classifier counts the number of members from each class within this set of k nearest neighbors and assigns the test signal to the class with the highest number of members.…”
Section: K-nearest Neighbor (mentioning; confidence: 99%)
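The majority-vote rule described in this citation statement can be written compactly. The sketch below is a generic illustration (Euclidean distance and k = 5 are assumptions), not the implementation used in the citing paper.

```python
# Minimal kNN classifier with the majority-vote rule described above.
import numpy as np


def knn_predict(X_train, y_train, x_test, k=5):
    """Classify x_test by a majority vote over its k nearest training points."""
    distances = np.linalg.norm(X_train - x_test, axis=1)  # Euclidean distances
    nearest = np.argsort(distances)[:k]                   # indices of the k nearest
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                      # most frequent class wins
```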