2015 Eighth International Conference on Advances in Pattern Recognition (ICAPR)
DOI: 10.1109/icapr.2015.7050683
A probabilistic framework for dynamic k estimation in kNN classifiers with certainty factor

Abstract: Accuracy of the well-known k-nearest neighbor (kNN) classifier heavily depends on the choice of k. The problem of estimating a suitable k for any test point becomes difficult due to several factors, such as the local distribution of training points around that test point, the presence of outliers in the dataset, and the dimensionality of the feature space. In this paper, we propose a dynamic k estimation algorithm based on the neighbor density function of a test point and class variance as well as certainty factor inform…
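The abstract only names the ingredients of the estimator (neighbor density, class variance, certainty factor) and the full criterion is cut off, so the sketch below is a generic illustration of per-point k selection in kNN rather than the paper's method: k is picked for each test point from the spread of its nearest-neighbor distances, followed by a plain majority vote. The spread heuristic, the candidate k range, and all names here are assumptions made for illustration, not the authors' certainty-factor rule.

import numpy as np
from collections import Counter

def dynamic_k_predict(X_train, y_train, x_test, k_min=3, k_max=15):
    # Distances from the test point to every training point.
    dists = np.linalg.norm(X_train - x_test, axis=1)
    order = np.argsort(dists)

    # Illustrative stand-in for dynamic k selection: prefer the k whose
    # nearest-neighbor distances are most tightly clustered (a crude
    # local-density proxy). This is NOT the paper's certainty-factor criterion.
    best_k, best_spread = k_min, np.inf
    for k in range(k_min, min(k_max, len(X_train)) + 1):
        spread = np.std(dists[order[:k]])
        if spread < best_spread:
            best_k, best_spread = k, spread

    # Plain majority vote among the best_k nearest neighbors.
    votes = Counter(y_train[order[:best_k]])
    return votes.most_common(1)[0][0], best_k

Called as dynamic_k_predict(X_train, y_train, x), the sketch returns both the predicted label and the k it settled on, which makes the per-point behaviour easy to inspect.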

Cited by 10 publications (2 citation statements)
References 23 publications
“…Eunomos is a legal information system which has been developed as a result of the ICT4LAW project [28]. It is intended to become a full-fledged legal document management system for legal firms, legal practitioners and legal scholars [29,30].…”
Section: Literature Review
confidence: 99%
“…The first matter is that KNN classification performance is affected by existing outliers, especially in small training sample-size situations [22]. This implies that one has to pay attention in selecting a suitable value for neighborhood size k [23]. Firstly, to overcome the influence of outliers, a local mean-based k nearest neighbor (LMKNN) classifier has been introduced in [3].…”
Section: Introduction
confidence: 99%
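The LMKNN classifier mentioned in the last excerpt is a known variant that replaces the majority vote with class-local mean vectors: for each class, the k nearest training points of that class are averaged, and the test point takes the class whose local mean lies closest, which damps the pull of isolated outliers. A minimal sketch of that idea follows; the function and parameter names are chosen here for illustration.

import numpy as np

def lmknn_predict(X_train, y_train, x_test, k=5):
    best_label, best_dist = None, np.inf
    for label in np.unique(y_train):
        X_c = X_train[y_train == label]              # training points of this class
        d = np.linalg.norm(X_c - x_test, axis=1)
        nearest = X_c[np.argsort(d)[:k]]             # k nearest points within the class
        local_mean = nearest.mean(axis=0)            # class-local mean vector
        dist = np.linalg.norm(local_mean - x_test)   # test point to local mean
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label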