2018
DOI: 10.1016/j.patrec.2018.03.021

Combining Minkowski and Chebyshev: New distance proposal and survey of distance metrics using k-nearest neighbours classifier

Cited by 67 publications (27 citation statements)
References 7 publications
“…When w1 is bigger than w2, this distance is similar to the Minkowski distance, and vice versa. The Minkowski-Chebyshev distance is defined as [23]:…”
Section: E. Minkowski-Chebyshev Distance (mentioning)
confidence: 99%
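The quoted excerpt describes how the combined distance behaves but leaves the formula implicit. Below is a minimal sketch, assuming (as in Rodrigues' proposal) that the distance is a weighted sum of a Minkowski term with weight w1 and a Chebyshev term with weight w2; the function name and defaults are illustrative, not taken from the paper.

```python
import numpy as np

def minkowski_chebyshev(x, y, w1=1.0, w2=1.0, p=2):
    """Weighted combination of a Minkowski (L_p) term and a Chebyshev (L_inf) term.

    When w1 dominates, the result behaves like a Minkowski distance;
    when w2 dominates, it behaves like the Chebyshev distance.
    """
    diff = np.abs(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    minkowski = np.sum(diff ** p) ** (1.0 / p)  # Minkowski term
    chebyshev = diff.max()                      # Chebyshev term
    return w1 * minkowski + w2 * chebyshev
```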
“…Moreover, in [16], a method to reduce the number of points for line approximation was proposed. One of the latest articles in this area by Rodrigues [17] offers a method for calculating distances based on the combination of the Minkowski and Chebyshev distances. Nowadays, this method is widely used in computer vision applications [18,19].…”
Section: Related Work (mentioning)
confidence: 99%
“…The first parameter is the proximity measure that determines which instances are closest. The second is the parameter k, which bounds the number of neighbouring instances considered [13]. Distance metrics are generally used as the proximity measure.…”
Section: Introduction (mentioning)
confidence: 99%
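To make the two parameters concrete, here is a short sketch of a kNN classifier that exposes both the proximity measure and the neighbour count k, assuming scikit-learn is available; the Iris dataset and the mc_distance helper (a re-implementation of the weighted Minkowski-Chebyshev combination sketched above) are purely illustrative.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def mc_distance(a, b, w1=1.0, w2=1.0, p=2):
    # Weighted Minkowski + Chebyshev combination (see the sketch above).
    d = np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float))
    return w1 * np.sum(d ** p) ** (1.0 / p) + w2 * d.max()

# The two kNN parameters named in the quote: the proximity measure (metric)
# and k, the number of neighbouring instances considered (n_neighbors).
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5, metric=mc_distance)
knn.fit(X_tr, y_tr)
print("test accuracy:", knn.score(X_te, y_te))
```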
“…They also stated that there is no single optimal distance measure appropriate for all datasets, and that each dataset favoured a particular distance measure. Rodrigues [13] proposed a new distance that combines the Minkowski and Chebyshev distances. To evaluate the efficiency of the proposed distance with kNN, an experiment was performed on 33 datasets from the UCI repository.…”
Section: Introduction (mentioning)
confidence: 99%