2018
DOI: 10.1016/j.patcog.2018.05.024
A Nonnegative Locally Linear KNN model for image recognition



Cited by 19 publications (8 citation statements)
References 29 publications
“…The Bayes Classifier is used in medicine [43]. The K-NN technique is used for image recognition, in areas such as facial recognition [44]. Perceptron, K-NN, and the Bayes Classifier are also used in weather forecasting [45].…”
Section: Methods and Techniques for Generating Workplace Procedures (mentioning)
confidence: 99%
“…We investigate its hidden neuron size across {10, 20, ..., 100} and the optimal value is 90. Moreover, KNN is an instance-based learning method and we set 100 nearest neighbors for the classification model [45]. It is found from the figure that the EL-SDAE achieves the highest accuracy for the eight subjects among all classifiers.…”
Section: Performance Comparison with Shallow MW Classifiers (mentioning)
confidence: 99%
“…The K-nearest neighbors or KNN is a very popular method in data mining and statistics [11], [41]. It is a method that calculates the distances between all training samples and every test sample [11], [41]. Then we have the nearest neighbor of every test sample.…”
Section: KNN (mentioning)
confidence: 99%
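The citation statement above describes the basic KNN procedure: compute the distance from each test sample to every training sample, then classify by its nearest neighbors. A minimal sketch of that generic procedure (illustrative only; not the paper's Nonnegative Locally Linear KNN model, whose function and parameter names here are assumptions) might look like:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test sample by majority vote among its k nearest
    training samples (Euclidean distance). Generic KNN sketch."""
    preds = []
    for x in X_test:
        # Distance from this test sample to every training sample
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = np.argsort(dists)[:k]          # indices of the k closest
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])  # majority vote
    return np.array(preds)
```

For example, with two well-separated clusters, a test point near each cluster is assigned that cluster's label.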
“…It is clear that KNN has a simple implementation, and results show that it has great performance [11], [41]. The KNN contains two major steps [11], [31], [41]: allocation of optimal samples [62] and allocation of different optimal values of k for different samples in the test data [62]. Great efforts have been made to select the optimal k in KNN [11], [31], [62].…”
Section: KNN (mentioning)
confidence: 99%
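The statement above notes the effort invested in choosing an optimal k. A common generic approach (a simple sketch using held-out validation; this is not the cited papers' per-sample adaptive method, and all names here are assumptions) is to pick the k that maximizes validation accuracy:

```python
import numpy as np

def select_k(X_train, y_train, X_val, y_val, candidates=(1, 3, 5, 7)):
    """Return the candidate k with the highest accuracy on a held-out
    validation set, using plain majority-vote KNN. Ties keep the
    first (smallest) candidate."""
    best_k, best_acc = candidates[0], -1.0
    for k in candidates:
        correct = 0
        for x, y in zip(X_val, y_val):
            dists = np.linalg.norm(X_train - x, axis=1)
            nearest = np.argsort(dists)[:k]
            labels, counts = np.unique(y_train[nearest], return_counts=True)
            if labels[np.argmax(counts)] == y:
                correct += 1
        acc = correct / len(y_val)
        if acc > best_acc:
            best_k, best_acc = k, acc
    return best_k
```

On cleanly separated data every candidate k attains the same accuracy, so the smallest candidate is returned; on noisier data the validation set discriminates between them.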