2017
DOI: 10.1051/matecconf/201714001024
The Performance Analysis of K-Nearest Neighbors (K-NN) Algorithm for Motor Imagery Classification Based on EEG Signal

Abstract: Most EEG-based motor imagery classification research focuses on the feature extraction phase of machine learning, neglecting the part that is crucial for accurate classification: the classifier itself. In contrast, this paper concentrates on classifier development, thoroughly studying the performance of the k-Nearest Neighbour (k-NN) classifier on EEG data. In the literature, the Euclidean distance metric is routinely applied for EEG data classification. However, no thorough study has be…

Cited by 33 publications (7 citation statements)
References 22 publications
“…KNN classification is a relatively simple clustering technique where a sample is classified by a plurality vote of its neighbors and assigned to the class based on the most common class among its k closest neighbors. 61 SVM finds a hyperplane that separates two classes with a high margin that maximizes the distances between nearest data points from each class. SVMs prove to be successful in nonlinear classification problems by mapping non-separable features into a higher dimensional space, a procedure known as the kernel trick which uses kernel functions such as Radial Basis Function (RBF) or polynomial.…”
Section: PPG Representations
confidence: 99%
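The citation statement above describes k-NN's plurality vote: a query point is assigned the most common class among its k closest training points. As a minimal sketch of that rule (the data, labels, and function name here are illustrative, not from the paper), using the Euclidean metric that the cited paper notes is routinely applied to EEG data:

```python
from collections import Counter
import math

def knn_predict(train, labels, query, k=3):
    """Classify `query` by a plurality vote of its k nearest training points.

    `train` is a list of feature vectors, `labels` the matching class labels.
    Uses Euclidean distance (math.dist) as the similarity measure.
    """
    # Sort training indices by distance from the query point.
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    # Count class labels among the k nearest and return the most common one.
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]
```

For example, with two well-separated clusters labelled "left" and "right" (as in a two-class motor imagery task), a query near the first cluster is voted into the "left" class by its three nearest neighbours.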
“…65 K-Nearest Neighbours The k-NN algorithm is one of the most used classifiers in machine learning. 10,21,33 This algorithm consists in associating the training data with a distance function and the class choice function based on the classes of nearest neighbours. Before classifying a new subject, it should be compared with another subject using a similarity measure.…”
Section: Model Building
confidence: 99%
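This citing paper's point, that k-NN pairs the training data with a distance function and a class-choice function, is also the premise of the paper under review: the distance metric is a swappable component. A sketch of a nearest-neighbour rule parameterised by its metric (the helper names are illustrative), contrasting Euclidean with Manhattan distance:

```python
import math

def manhattan(u, v):
    """Manhattan (city-block) distance: sum of absolute coordinate differences."""
    return sum(abs(a - b) for a, b in zip(u, v))

def nn_classify(train, labels, query, dist=math.dist):
    """Return the label of the single training point nearest to `query`,
    where `dist` is any callable taking two vectors (Euclidean by default)."""
    nearest = min(range(len(train)), key=lambda i: dist(train[i], query))
    return labels[nearest]
```

Passing `dist=manhattan` instead of the default changes the similarity measure without touching the classifier logic, which is exactly the kind of metric substitution whose effect on EEG data the paper studies.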
“…SVM [ 41 ] is a supervised machine learning model that uses classification algorithms for two-group classification problems [ 42 ]. The second classifier is KNN [ 43 ], representing a non-parametric machine learning method where the input consists of the k closest training examples in feature space, while the output depends on whether KNN is used for classification or regression [ 44 ]. Finally, TREE classifier [ 45 ] was used as a predictive modelling approach.…”
Section: Methods
confidence: 99%
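The SVM described in this citation statement ultimately classifies by which side of a separating hyperplane a point falls on. As a bare sketch of that decision rule only (the weights and bias here are hypothetical, not a trained model, and margin maximisation and the kernel trick are omitted):

```python
def hyperplane_classify(w, b, x):
    """Two-class decision rule: return +1 or -1 depending on which side of
    the hyperplane w . x + b = 0 the point x lies."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1
```

Training an SVM amounts to choosing `w` and `b` so this boundary separates the classes with maximal margin; kernel methods apply the same rule in an implicitly mapped higher-dimensional space.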