Adaptive Discriminant and Quasiconformal Kernel Nearest Neighbor Classification
2005
DOI: 10.1007/10984697_8

Cited by 6 publications (7 citation statements)
References 18 publications
“…Many extensions of the basic method have been proposed over the years. It is possible to use kNN in conjunction with kernels [3], perform large margin learning [4], multi-label classification [5], adaptively determine the neighborhood size [6], etc.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…For instance, (i) its performance highly depends on the selection of k; (ii) pooling nearest neighbors from training data that contain overlapping classes is considered unsuitable; (iii) the so-called curse of dimensionality can severely hurt its performance in finite samples [25,26]; and finally (iv) the selection of the distance metric is crucial to determine the outcome of the nearest neighbor classification [26].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
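The sensitivity to the choice of k and of the distance metric noted in the statement above can be seen in a minimal sketch. The data points, labels, and query below are invented toy values for illustration only, not from any cited paper:

```python
# Minimal kNN classifier illustrating sensitivity to k.
# Toy data and labels are invented for demonstration only.
from collections import Counter

def knn_predict(train, labels, query, k, dist):
    # Sort training points by distance to the query, take the k nearest,
    # and return the majority label among them.
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i], query))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# Two interchangeable metrics: the metric is a free parameter of kNN.
euclid = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
manhattan = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))

train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0),   # class A cluster
         (2.0, 2.0), (3.0, 3.0), (3.5, 3.5)]   # class B, more spread out
labels = ["A", "A", "A", "B", "B", "B"]
query = (1.4, 1.4)

print(knn_predict(train, labels, query, k=1, dist=euclid))  # -> "B"
print(knn_predict(train, labels, query, k=5, dist=euclid))  # -> "A"
```

With k=1 the single nearest neighbor (a B point) decides; with k=5 the majority flips to A, showing how the same query can receive different labels purely from the choice of k.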
“…Prior work in adaptive neighborhoods for k-NN has largely focused on locally adjusting the distance metric [11]- [20]. The rationale behind these adaptive metrics is that many feature spaces are not isotropic and the discriminability provided by each feature dimension is not constant throughout the space.…”
Section: Enclosing Neighborhoods
Citation type: mentioning (confidence: 99%)
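The anisotropy argument in the statement above can be sketched with a feature-weighted distance: dimensions along which the classes separate well get larger weights. The weighting rule below (absolute difference of class means) is a deliberately crude illustrative assumption, not the adaptive-metric method of any particular cited paper:

```python
# Sketch of a feature-weighted (anisotropic) distance for kNN.
# The data and the weighting rule are illustrative assumptions.

def class_means(train, labels):
    # Mean vector of each class.
    sums, counts = {}, {}
    for x, y in zip(train, labels):
        s = sums.setdefault(y, [0.0] * len(x))
        for d, v in enumerate(x):
            s[d] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def discriminative_weights(train, labels):
    # Weight each dimension by the absolute difference of the two class
    # means, a crude proxy for that dimension's local discriminability,
    # then normalize the weights to sum to one.
    means = list(class_means(train, labels).values())
    w = [abs(a - b) for a, b in zip(means[0], means[1])]
    total = sum(w) or 1.0
    return [v / total for v in w]

def weighted_dist(a, b, w):
    # Euclidean distance with per-dimension weights.
    return sum(wi * (x - y) ** 2 for wi, x, y in zip(w, a, b)) ** 0.5

# Dimension 0 separates the classes; dimension 1 is pure noise.
train = [(0.0, 5.0), (1.0, -5.0), (9.0, 5.0), (10.0, -5.0)]
labels = ["A", "A", "B", "B"]
w = discriminative_weights(train, labels)
print(w)  # -> [1.0, 0.0]: the noisy dimension is suppressed
```

Under this weighted metric, large displacements along the noisy dimension no longer dominate the neighbor ranking, which is the effect the adaptive-metric line of work aims for.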