2017
DOI: 10.3390/sym9090179
NS-k-NN: Neutrosophic Set-Based k-Nearest Neighbors Classifier

Abstract: k-nearest neighbors (k-NN), which is known to be a simple and efficient approach, is a non-parametric supervised classifier. It aims to determine the class label of an unknown sample from its k nearest neighbors stored in a training set. The k nearest neighbors are determined by some distance function. Although k-NN produces successful results, there have been several extensions aimed at improving its precision. The neutrosophic set (NS) defines three memberships, namely T, I, and F. T, I, and F show the …

Cited by 70 publications (27 citation statements) | References 16 publications
“…Akbulut et al [25] stated that the KNN method is considered to be one of the oldest and the simplest types of nonparametric classifier. A nonparametric model for KNN means that the classification of the test data does not use a function that has been set up in advance based on the learning process on the training dataset.…”
Section: K-Nearest Neighbors (mentioning)
confidence: 99%
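The nonparametric, "lazy" nature described in the quote above can be illustrated with a minimal sketch: no function is fitted in advance; the training set is simply stored, and all distance computation happens when a test sample arrives. The dataset and helper name below are hypothetical, for illustration only.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    samples (Euclidean distance). No model is set up in advance: all
    computation happens at query time ("lazy" learning)."""
    # Sort stored training samples by distance to the query point.
    neighbors = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    # Majority vote over the neighbors' class labels.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy dataset: (features, label) pairs stored as-is, with no training step.
train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((5.2, 4.9), "B")]
print(knn_classify(train, (1.1, 0.9), k=3))  # "A"
```

Because nothing is precomputed, every classification pays the full cost of scanning the training set, which is the trade-off implied by calling k-NN nonparametric.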
“…The basic theory behind the KNN is discovering a group of "k" samples (e.g., employing the distance functions) which has the nearest distance from unknown samples in the calibration dataset. Moreover, the KNN identifies the class of unknown samples among the "k" samples by calculating the average of the response variables [45]. Thus, the "k" plays an important role in the KNN performance [46].…”
Section: Models Developed (mentioning)
confidence: 99%
“…The KNN finds a group of k samples (e.g., using the distance functions) which are closest to unknown samples in the calibration dataset, constituting the basic theory of KNN. The KNN also determines the label (class) of unknown samples among the k samples through the calculation of the average of the response variables [61,62]. Consequently, k plays a significant role in the performance of the KNN [63].…”
Section: K-Nearest Neighbor (KNN) (mentioning)
confidence: 99%
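The two statements above describe the regression form of k-NN, where the prediction is the average of the response variables of the k nearest samples, and note that k governs performance. A minimal sketch of that averaging, with a hypothetical toy dataset, shows how the choice of k smooths the prediction:

```python
import math

def knn_regress(train, query, k):
    """Predict the response for `query` as the average of the response
    values of its k nearest samples (Euclidean distance). The value of
    k controls how many neighbors are averaged, i.e. the smoothing."""
    neighbors = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    return sum(y for _, y in neighbors) / k

# Toy calibration set: (features, response) pairs; one outlier at x = 3.
train = [((0.0,), 1.0), ((1.0,), 2.0), ((2.0,), 3.0), ((3.0,), 10.0)]
query = (1.5,)
print(knn_regress(train, query, k=2))  # averages responses 2.0 and 3.0 -> 2.5
print(knn_regress(train, query, k=4))  # averages all four responses -> 4.0
```

With a small k the prediction follows the nearest responses closely; a large k pulls in distant samples (including the outlier here), which is why k plays such a significant role in k-NN performance.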