2021 International Conference on Big Data Analysis and Computer Science (BDACS) 2021
DOI: 10.1109/bdacs53596.2021.00056
k-Nearest Neighbor algorithm based on feature subspace

Cited by 6 publications (2 citation statements). References 2 publications.
“… 11 The Classification Learner results for Q3 (Is there enough signal to distinguish the gut microbiome profiles of responders from non-responders before starting a KD?) identified the Ensemble Subspace k-nearest neighbours (kNN) model 33 as having the highest estimated accuracy, 71.4%, among all available algorithmic models. Ensemble methods combine multiple classifiers, which markedly improves predictive accuracy and overall performance, particularly when less delineated characteristics exist in the dataset.…”
Section: Results
confidence: 99%
“…The bagged trees method implements decision trees using bootstrap aggregation (bagging) [35]. The subspace k-NN method uses the random subspace method for k-NN classification [36,37].…”
Section: Signature Classification
confidence: 99%