A parameter-free KNN for rating prediction
2022 | DOI: 10.1016/j.datak.2022.102095

Cited by 13 publications (3 citation statements) | References 10 publications
“…In scenarios with small-scale datasets where the distribution of signal data is irregular in space, it may be challenging for SVM to produce a decision plane. In such cases, KNN [127], as an instance-based learning method, might offer a more straightforward solution. KNN classifies or regresses based on the majority class of the k-nearest neighbors around a new data point, making it less influenced by the data distribution.…”
Section: ML for TENGs
Citation type: mentioning
Confidence: 99%
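The quoted description of instance-based KNN can be made concrete with a short sketch. This is our illustration of plain majority-vote KNN classification, not code from the cited paper; all function and variable names are illustrative.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=5):
    """Classify x_new by majority vote among its k nearest training points."""
    # Euclidean distance from x_new to every stored training instance
    dists = np.linalg.norm(X_train - x_new, axis=1)
    # Indices of the k closest neighbors
    nearest = np.argsort(dists)[:k]
    # Majority class among those neighbors decides the label
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy usage: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [5.2, 4.9]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([4.8, 5.0]), k=3))  # -> 1
```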
“…Several similar studies have been carried out, such as [14], which utilizes a Mini-PC (Raspberry Pi 3) [27], [28] to retrieve data from OBD-II that is then processed using machine learning [29], [30] with the K-Nearest Neighbour (KNN) [31]-[33] and Naive Bayesian [34]-[36] methods.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
“…Junior Medjeu Fopa et al. propose a parameter-free KNN method for rating prediction called freeKNN. It dynamically selects an appropriate number of neighbors depending on the user and the item to be rated [3]. Some researchers propose an improved k-nearest neighbor algorithm, denoted Dk-NN, which uses a dynamic k instead of a single value of k [4], and others vary k over 1, 3, 5, …, √n and use the majority vote over all the resulting predictions to label an instance [2].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
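The varying-k scheme attributed to [2] can be sketched as follows: run KNN once for each odd k up to √n, then take a majority vote across the per-k predictions. This is a minimal illustration under our reading of the excerpt, assuming n is the number of training instances; it is not the authors' implementation, and the names are ours.

```python
import numpy as np
from collections import Counter

def varying_k_predict(X_train, y_train, x_new):
    """Label x_new by a majority vote over KNN runs with k = 1, 3, 5, ..., sqrt(n)."""
    n = len(X_train)
    # Sort neighbors by distance once; every value of k reuses this ordering
    order = np.argsort(np.linalg.norm(X_train - x_new, axis=1))
    votes = []
    for k in range(1, int(np.sqrt(n)) + 1, 2):  # odd k avoids ties within a single run
        votes.append(Counter(y_train[order[:k]]).most_common(1)[0][0])
    # Final label: majority vote across all the k-specific predictions
    return Counter(votes).most_common(1)[0][0]
```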