1993
DOI: 10.1007/bf00993481

A weighted nearest neighbor algorithm for learning with symbolic features

Abstract: In the past, nearest neighbor algorithms for learning from examples have worked best in domains in which all features had numeric values. In such domains, the examples can be treated as points and distance metrics can use standard definitions. In symbolic domains, a more sophisticated treatment of the feature space is required. We introduce a nearest neighbor algorithm for learning in domains with symbolic features. Our algorithm calculates distance tables that allow it to produce real-valued distance…
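The distance tables the abstract refers to can be illustrated with a value-difference-style metric, in which two symbolic values are close when they induce similar class distributions. The sketch below is illustrative only (the function names and the simple absolute-difference form are assumptions, not the paper's exact MVDM formulation):

```python
from collections import defaultdict

def build_value_distance(examples, labels, n_features):
    """Build per-feature value-distance tables: two symbolic values are
    considered close when they predict similar class distributions.
    Illustrative sketch of a value-difference-style metric."""
    # counts[f][v][c] = how often value v of feature f co-occurs with class c
    counts = [defaultdict(lambda: defaultdict(int)) for _ in range(n_features)]
    for x, y in zip(examples, labels):
        for f, v in enumerate(x):
            counts[f][v][y] += 1
    classes = set(labels)

    def value_distance(f, v1, v2, k=1):
        # Compare the conditional class distributions of the two values.
        n1 = sum(counts[f][v1].values())
        n2 = sum(counts[f][v2].values())
        return sum(abs(counts[f][v1][c] / n1 - counts[f][v2][c] / n2) ** k
                   for c in classes)

    return value_distance

def example_distance(x1, x2, value_distance):
    # Overall distance between two examples sums the per-feature
    # value distances looked up in the tables.
    return sum(value_distance(f, a, b) for f, (a, b) in enumerate(zip(x1, x2)))
```

For instance, if the value "red" always co-occurs with one class and "blue" always with another, their table distance is maximal; two values with identical class profiles get distance zero.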

Cited by 380 publications (317 citation statements)
References 25 publications
“…This measure is obviously less informative than its numeric counterpart, and, although it is appropriate in some cases, its use can lead to poor performance (Cost & Salzberg, 1993). A more sophisticated alternative consists of considering two symbolic values to be similar if they make similar predictions (i.e., if they correlate similarly with the class feature).…”
Section: Instance-based Learning
confidence: 99%
“…With multiple instances of each class, the frontier will be composed of a number of hyperplanar sections, and can thus become quite complex even when few instances are present (Aha et al, 1991). The introduction of weights further increases this complexity, turning the hyperplanes into hyperquadrics (Cost & Salzberg, 1993).…”
Section: Instance-based Learning
confidence: 99%