2021
DOI: 10.3390/e23121568
Neural Network Used for the Fusion of Predictions Obtained by the K-Nearest Neighbors Algorithm Based on Independent Data Sources

Abstract: The article concerns the problem of classification based on independent data sets—local decision tables. The aim of the paper is to propose a classification model for dispersed data using a modified k-nearest neighbors algorithm and a neural network. A neural network, more specifically a multilayer perceptron, is used to combine the prediction results obtained based on local tables. Prediction results are stored in the measurement level and generated using a modified k-nearest neighbors algorithm. The task of …
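The pipeline the abstract describes can be sketched roughly as follows: each independent local decision table produces a class-probability vector (the "measurement level" output) with k-NN, and the vectors are then fused into one decision. All names and data here are hypothetical, and the fusion step is simplified to a plain average; the paper itself trains a multilayer perceptron for that step.

```python
import math
from collections import Counter

def knn_predict_proba(table, x, k=3, classes=(0, 1)):
    """Class-probability vector from a plain k-NN vote over one
    local decision table, given as a list of (features, label) pairs."""
    nearest = sorted(table, key=lambda row: math.dist(row[0], x))[:k]
    votes = Counter(label for _, label in nearest)
    return [votes[c] / k for c in classes]

def fuse_predictions(prob_vectors):
    """Fuse local prediction vectors into a final class.
    The paper trains a multilayer perceptron here; a plain average
    is used as a stand-in so the sketch stays self-contained."""
    n = len(prob_vectors)
    fused = [sum(v[i] for v in prob_vectors) / n
             for i in range(len(prob_vectors[0]))]
    return fused.index(max(fused))  # arg-max class

# Three independent local tables describing the same two classes.
tables = [
    [((0.0, 0.1), 0), ((0.2, 0.0), 0), ((1.0, 1.1), 1), ((0.9, 1.0), 1)],
    [((0.1, 0.2), 0), ((0.0, 0.0), 0), ((1.1, 0.9), 1), ((1.0, 1.2), 1)],
    [((0.2, 0.1), 0), ((0.1, 0.0), 0), ((0.8, 1.0), 1), ((1.2, 1.1), 1)],
]
query = (0.95, 1.05)
local = [knn_predict_proba(t, query, k=3) for t in tables]
print(fuse_predictions(local))  # prints 1 for this query
```

The point of the two-stage structure is that the local tables never have to be merged: only the low-dimensional prediction vectors leave each site, which is what makes the approach suitable for dispersed data.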

Cited by 10 publications (3 citation statements)
References 21 publications
“…The relative distance between instances is more important than their absolute position within a given region [ 19 ]. The k -NN algorithm is suitable for analyzing large, multidimensional datasets [ 41 , 44 ], and is the optimal method when prior knowledge of the data distribution is lacking [ 17 , 45 ]. Furthermore, there is no requirement for off-line training when using the k -NN algorithm, so it is also time efficient [ 14 ].…”
Section: Discussion
confidence: 99%
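Two of the quoted claims are easy to illustrate: k-NN has no off-line training step (the "model" is just the stored data), and only relative distances matter, so rigidly shifting every instance and the query leaves the decision unchanged. A minimal sketch with hypothetical data:

```python
import math
from collections import Counter

def knn_classify(data, x, k=3):
    """Classify x by majority vote among its k nearest rows.
    There is no training step: the 'model' is the data itself."""
    nearest = sorted(data, key=lambda row: math.dist(row[0], x))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

data = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
        ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
query = (0.2, 0.1)

# Rigidly translate every point and the query by the same offset.
shift = (5.0, -3.0)
shifted_data = [((p[0] + shift[0], p[1] + shift[1]), y) for p, y in data]
shifted_query = (query[0] + shift[0], query[1] + shift[1])

# Only relative distances matter, so the decision is unchanged.
print(knn_classify(data, query))                  # prints a
print(knn_classify(shifted_data, shifted_query))  # prints a
```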
“…Evidence theory, also known as Dempster-Shafer evidence theory [28,29], is able to distinguish between 'uncertainty' and 'not knowing' and is able to deal with uncertainty arising from 'not knowing' with a high degree of flexibility, and thus has gained wide application in uncertainty inference and data fusion [30,31].…”
Section: Introduction to DS Evidence Theory
confidence: 99%
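The distinction the quote draws between 'uncertainty' and 'not knowing' shows up directly in Dempster's rule of combination: mass assigned to the whole frame represents uncommitted belief rather than belief split between hypotheses. A minimal sketch, restricted to the standard rule over focal elements given as frozensets (the numbers are hypothetical):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments: m(A) = sum over B ∩ C = A of m1(B)m2(C) / (1 - K),
    where K is the total mass of conflicting (disjoint) pairs."""
    combined = {}
    conflict = 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c
            if a:
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:
                conflict += mb * mc
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

A, B = frozenset("a"), frozenset("b")
theta = A | B  # the whole frame: mass here encodes 'not knowing'
m1 = {A: 0.6, theta: 0.4}          # belief in 'a', rest uncommitted
m2 = {A: 0.5, B: 0.3, theta: 0.2}
print(dempster_combine(m1, m2))
```

After combination most of the uncommitted mass migrates to the singletons the two sources agree on, which is why the rule is popular for fusing evidence from independent sources.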
“…Neural networks have been considered for dispersed data in various applications. The papers [9,10] considered neural networks as a model for aggregating prediction vectors generated by local classifiers. In the paper [11], neural networks were used in a federated learning approach.…”
Section: Introduction
confidence: 99%