2021
DOI: 10.21831/jeatech.v2i2.42777
K-Nearest Neighbor (K-NN) algorithm with Euclidean and Manhattan in classification of student graduation

Abstract: The K-Nearest Neighbor (K-NN) algorithm is a classification algorithm that has been proven to solve various classification problems. Two approaches that can be used with this algorithm are K-NN with Euclidean distance and K-NN with Manhattan distance. This research applies both approaches to classify student graduation. Student graduation is determined by the variables of gender, major, number of first-semester credits, number of second-semester credits, number of third-semester …
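The abstract describes K-NN classification with two interchangeable distance metrics. As a minimal illustration (not the authors' RapidMiner implementation, and with toy data in place of the student dataset), a pure-Python K-NN classifier supporting both Euclidean and Manhattan distance might look like this:

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3, metric="euclidean"):
    """Classify `query` by majority vote among its k nearest training points."""
    def dist(a, b):
        if metric == "euclidean":
            # Straight-line distance: sqrt of summed squared differences.
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        if metric == "manhattan":
            # City-block distance: sum of absolute differences.
            return sum(abs(x - y) for x, y in zip(a, b))
        raise ValueError(f"unknown metric: {metric}")

    # Sort training points by distance to the query and take the k closest.
    nearest = sorted(zip(train, labels), key=lambda p: dist(p[0], query))[:k]
    # Majority vote over the labels of the k neighbours.
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical toy data: two clusters labelled "graduate" / "not graduate".
train = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
labels = ["graduate", "graduate", "graduate",
          "not graduate", "not graduate", "not graduate"]

print(knn_predict(train, labels, (2, 2), k=3, metric="euclidean"))
print(knn_predict(train, labels, (7, 8), k=3, metric="manhattan"))
```

Swapping `metric` changes only the distance function; the neighbour search and voting are identical, which is why the paper can compare the two approaches on the same dataset and the same values of k.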

Cited by 27 publications (11 citation statements) | References 10 publications
“…In this study, the k value used was 3, yielding an accuracy rate of 92.07%, while k values less than or greater than 3 produced accuracies below 92.07%. These results differ from those of Hidayati & Hermawan (2021), who found that the best k value was 7, with an accuracy rate of 85.25%. The difference in results, however, is mainly due to the use of different measure types and mixed measures.…”
Section: Discussion (contrasting)
confidence: 99%
“…This algorithm was executed in the RapidMiner software and tested on a dataset comprising 380 training data points and 163 test data points. The findings show that using the Euclidean and Manhattan distance methods to classify graduating students yielded the highest accuracy rate of 85.28% when k was set to 7 [25]. The study 'Application of Data Mining in Classifying Student Data Based on Academic Data Before College and Study Period Using the K-Medoids Method' used variables including average math scores [26].…”
Section: Related Research (mentioning)
confidence: 99%
“…i = data variable. In the k-NN algorithm, the value of k can be a fixed constant or a random variable, and the distance metric used can be Euclidean, Minkowski, Manhattan, Chebyshev, correlation, or others [26].…”
Section: 2) K-NN (unclassified)
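The citation statement above lists several distance metrics as interchangeable choices for k-NN. A small sketch (with a hypothetical pair of 2-D points, not data from the paper) makes their relationship concrete: Minkowski distance with parameter p generalises both of the paper's metrics, giving Manhattan at p=1 and Euclidean at p=2, while Chebyshev takes the largest single-coordinate difference.

```python
def minkowski(a, b, p):
    """Minkowski distance: p=1 is Manhattan, p=2 is Euclidean."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

def chebyshev(a, b):
    """Chebyshev distance: the maximum coordinate-wise difference."""
    return max(abs(x - y) for x, y in zip(a, b))

# Hypothetical points forming a 3-4-5 right triangle.
a, b = (0, 0), (3, 4)
print(minkowski(a, b, 1))  # Manhattan: 3 + 4 = 7.0
print(minkowski(a, b, 2))  # Euclidean: sqrt(9 + 16) = 5.0
print(chebyshev(a, b))     # Chebyshev: max(3, 4) = 4
```

Because a different metric reorders which training points count as "nearest", the best k found with one metric need not carry over to another, which is consistent with the contrasting accuracies (92.07% at k=3 vs. 85.25% at k=7) reported by the citing studies above.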