2019
DOI: 10.1007/978-3-030-13469-3_36
A New Weighted k-Nearest Neighbor Algorithm Based on Newton’s Gravitational Force

Abstract: The kNN algorithm has three main advantages that make it appealing to the community: it is easy to understand, it regularly offers competitive performance, and its structure can be easily tuned to adapt to the needs of researchers and achieve better results. One variation is weighting the instances based on their distance. In this paper we propose a weighting based on Newton's gravitational force, so that a mass (or relevance) has to be assigned to each instance. We evaluated this idea in the kNN…
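The abstract stops short of the weighting formula, but by analogy with Newton's law F = G·m1·m2/d², a gravity-style weight for neighbor i would be proportional to m_i/d_i². The sketch below is a minimal Python illustration of that reading; the function name, the zero-distance guard, and how masses are obtained are assumptions, not the paper's exact definitions.

```python
import numpy as np

def gravitational_knn_predict(X_train, y_train, masses, x, k=5):
    """Gravity-weighted k-NN vote (a hypothetical reading of the paper's idea).

    Each training instance i carries a mass m_i, and its vote is weighted
    by m_i / d_i**2; the constant G and the query's own mass cancel out
    as common factors across all neighbors.
    """
    d = np.linalg.norm(X_train - x, axis=1)   # distance from x to every training point
    nn = np.argsort(d)[:k]                    # indices of the k nearest neighbors
    w = masses[nn] / (d[nn] ** 2 + 1e-12)     # m_i / d_i^2, guarded against d_i = 0
    scores = {}
    for i, wi in zip(nn, w):
        scores[y_train[i]] = scores.get(y_train[i], 0.0) + wi
    return max(scores, key=scores.get)        # class pulled with the largest total "force"
```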

Cited by 6 publications (4 citation statements) | References 14 publications

Citation statements:
“…Usually, the coordinates of the first K reference points are selected. The distance weighted K-Nearest Neighbor (DW-KNN) is based on the KNN [30], where the K nearest neighbors can be obtained by sorting according to formulae (5) and (6).…”
Section: Signal Distance Corrector (mentioning)
confidence: 99%
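Formulae (5) and (6) belong to the citing paper and are not reproduced here; as a stand-in, the following minimal sketch uses the common inverse-distance weight 1/d, the classic DW-KNN flavor. All names are illustrative.

```python
import numpy as np

def dw_knn_predict(X_train, y_train, x, k=5):
    """Distance-weighted k-NN vote with generic 1/d weights.

    This substitutes a standard inverse-distance weight for the cited
    paper's formulae (5) and (6), which are not available here.
    """
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]                    # sort by distance, keep the k nearest
    w = 1.0 / (d[nn] + 1e-12)                 # closer neighbors vote with more weight
    scores = {}
    for i, wi in zip(nn, w):
        scores[y_train[i]] = scores.get(y_train[i], 0.0) + wi
    return max(scores, key=scores.get)
```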
“…(Parvinnia et al, 2014) also computed a weight for each training object based on a matching strategy. Similarly, (Aguilera et al, 2019) proposed a weighting based on Newton's gravitational force, so that a mass (or relevance) is assigned to each instance. Two methods of mass assignment are presented: circled by its own class (CC) and circled by different class (CD).…”
Section: Literature Review (mentioning)
confidence: 99%
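The statement names the two mass-assignment methods only by their labels. One plausible reading, sketched below under that assumption, scores each training instance by how much of its own neighborhood shares its class; the neighborhood size m, the function name, and the exact scoring are hypothetical, not the paper's definitions.

```python
import numpy as np

def assign_masses(X_train, y_train, m=5, mode="CC"):
    """Hypothetical reading of the CC / CD mass-assignment idea.

    For each training instance, inspect its m nearest neighbors
    (excluding itself) and measure the fraction sharing its class.
    'CC' (circled by its own class) rewards instances surrounded by
    their own class; 'CD' (circled by different class) rewards the
    opposite. Expects y_train as a NumPy array of labels.
    """
    n = len(X_train)
    masses = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X_train - X_train[i], axis=1)
        nn = np.argsort(d)[1:m + 1]           # skip index 0: the point itself
        same = np.mean(y_train[nn] == y_train[i])
        masses[i] = same if mode == "CC" else 1.0 - same
    return masses
```

These masses would then plug directly into the gravity-style weights sketched after the abstract above.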
“…Generally speaking, k-NN only needs one parameter to be adjusted, k, which represents how many of the closest neighbors are considered to classify an unseen object. Once this parameter is set, two main approaches are followed to classify an object: the vote of the majority of the k neighbors, and a weighted vote of all k neighbors in which each vote is weighted by that neighbor's distance to the object being classified [11].…”
Section: K-Nearest Neighbor (k-NN) (mentioning)
confidence: 99%
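A small worked example shows how the two decision rules can disagree. With k = 3, suppose the nearest neighbors lie at distances 1, 2, and 3 and carry classes B, A, A. The majority vote picks A (two votes against one). With inverse-distance weights, B scores 1/1 = 1.0 while A scores 1/2 + 1/3 ≈ 0.83, so the weighted rule picks B.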