2015 IEEE International Conference on Information and Automation
DOI: 10.1109/icinfa.2015.7279570
Weighted-KNN and its application on UCI

Cited by 11 publications (7 citation statements, published 2016–2023) · References 1 publication
“…KNN is also known as a lazy learner because in its training phase it simply stores all the training tuples, or performs little computation on them, and waits until a test tuple is given to it [1] [4]. When a test tuple is given, it calculates the similarity between the test tuple and all the training tuples using distance metrics. Common distance metrics include the Manhattan, Minkowski, Mahalanobis, and Chebyshev distances; the most commonly used distance metric in KNN is the Euclidean distance, which gives better results than the other metrics [5] [6]. The Euclidean distance metric is used when the attributes are not strongly correlated [5] [6].…”
Section: Introduction
confidence: 99%
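For concreteness, here is a minimal sketch of the distance metrics the statement names (Mahalanobis is omitted because it additionally needs a covariance estimate). The function names and the use of NumPy are illustrative assumptions, not from the cited paper.

```python
import numpy as np

def manhattan(x, y):
    # L1 norm: sum of absolute coordinate differences
    return np.sum(np.abs(x - y))

def euclidean(x, y):
    # L2 norm: the metric most commonly used with KNN
    return np.sqrt(np.sum((x - y) ** 2))

def chebyshev(x, y):
    # L-infinity norm: largest single coordinate difference
    return np.max(np.abs(x - y))

def minkowski(x, y, p=3):
    # General Lp norm; p=1 gives Manhattan, p=2 gives Euclidean
    return np.sum(np.abs(x - y) ** p) ** (1.0 / p)

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 0.0, 3.0])
print(manhattan(x, y), euclidean(x, y), chebyshev(x, y), minkowski(x, y))
```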
“…The Euclidean distance metric is used when the attributes are not strongly correlated [5] [6]. Let x be a sample whose class label is unknown and y a set of n training tuples [7]; then the Euclidean distance d(x, y) is calculated using the formula:…”
Section: Introduction
confidence: 99%
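The snippet is truncated before the formula, but the standard Euclidean distance it refers to is d(x, y) = sqrt(Σᵢ (xᵢ − yᵢ)²). A minimal plain-KNN classifier built on it might look as follows; the dataset, the k value, and the function names are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_test, k=3):
    # Euclidean distance d(x, y) = sqrt(sum_i (x_i - y_i)^2)
    dists = np.sqrt(np.sum((X_train - x_test) ** 2, axis=1))
    # Indices of the k nearest training tuples
    nearest = np.argsort(dists)[:k]
    # Plain KNN: unweighted majority vote among the k neighbours
    return Counter(y_train[nearest]).most_common(1)[0][0]

X_train = np.array([[1.0, 1.0], [1.2, 0.8], [8.0, 8.0], [7.5, 8.2]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([1.1, 0.9])))  # -> 0
```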
“…Li et al. proposed an attribute weighting method in which unrelated attributes are reduced and a weight is assigned to each attribute by a sensitivity method, which increases the efficiency of the algorithm [15]. A new algorithm based on dynamic weighting to enhance the classification accuracy of the KNN algorithm was devised by K. Maryam.…”
Section: International Journal of Computer Applications (0975-8887)
confidence: 99%
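Attribute weighting enters KNN through the distance computation. The sketch below shows where per-attribute weights act; the weight values are placeholders, since in [15] they would come from a sensitivity analysis that the snippet does not detail.

```python
import numpy as np

def weighted_euclidean(x, y, w):
    # Per-attribute weights w_i scale each coordinate's contribution,
    # so irrelevant attributes (w_i near 0) barely affect the distance.
    return np.sqrt(np.sum(w * (x - y) ** 2))

# Hypothetical weights; a sensitivity analysis as in [15] would produce them.
w = np.array([1.0, 0.1, 0.0])
x = np.array([1.0, 5.0, 100.0])
y = np.array([2.0, 4.0, -50.0])
print(weighted_euclidean(x, y, w))  # the third attribute contributes nothing
```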
“…Xiao and Ding suggested a weighting method based on the weighted entropy of attribute values, which enhances classification accuracy [16]. Li et al. proposed an attribute weighting method in which irrelevant attributes are reduced and weights are assigned by a sensitivity method, which improves the efficiency of the algorithm [17]. A distance-weighted K-Nearest-Neighbor rule was developed by Lan Du et al. using a dual distance-weighted function [18].…”
Section: ©IJRASET (UGC Approved Journal)
confidence: 99%
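Distance-weighted KNN replaces the unweighted majority vote with votes scaled by proximity. The sketch below uses a simple inverse-distance weight as an assumption; it is a common choice but not the specific dual distance-weighted function of [18].

```python
import numpy as np
from collections import defaultdict

def dw_knn_predict(X_train, y_train, x_test, k=3, eps=1e-9):
    # Each of the k nearest tuples votes with weight 1 / (d + eps),
    # so closer neighbours count for more than distant ones.
    dists = np.sqrt(np.sum((X_train - x_test) ** 2, axis=1))
    nearest = np.argsort(dists)[:k]
    votes = defaultdict(float)
    for i in nearest:
        votes[y_train[i]] += 1.0 / (dists[i] + eps)
    return max(votes, key=votes.get)

X_train = np.array([[0.0, 0.0], [0.5, 0.5], [5.0, 5.0]])
y_train = np.array([0, 0, 1])
print(dw_knn_predict(X_train, y_train, np.array([4.0, 4.0])))  # -> 1
```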