2022
DOI: 10.1016/j.aej.2021.09.004

Improved KNN algorithms of spherical regions based on clustering and region division

Cited by 26 publications (10 citation statements)
References 17 publications
“…The learning algorithm chosen for the data-modeling phase with the aim of detecting seizures was k-nearest neighbor (k-NN) [27]. k-NN is one of the most used machine learning techniques for data mining, showing very high performance in many applications [28], such as in satellite scene classification, handwritten digit identification, fraud detection, ECG-pattern discovery, and in the detection of new COVID-19 [29] from human genome sequences. k-NN is often successful where the classes are not linearly separable because the decision boundary is very irregular [30].…”
Section: Methods (mentioning)
confidence: 99%
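
The excerpt above invokes k-NN only in general terms. As a minimal sketch of the basic classifier it refers to (not the cited paper's improved spherical-region variant), the brute-force implementation below uses Euclidean distance and a majority vote; the function name, array conventions, and the default k=5 are assumptions made for illustration.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, X_test, k=5):
    """Brute-force k-NN: classify each test point by majority vote
    among its k nearest training points under Euclidean distance."""
    preds = []
    for x in X_test:
        # squared Euclidean distance from x to every training sample
        d2 = np.sum((X_train - x) ** 2, axis=1)
        # labels of the k closest training samples
        nearest_labels = y_train[np.argsort(d2)[:k]]
        # majority vote decides the predicted class
        preds.append(Counter(nearest_labels).most_common(1)[0][0])
    return np.array(preds)
```
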
“…First, a 50 Hz notch filter was applied to EEG data in order to remove line noise due to signal acquisition and digital conversion. Afterwards, a band-pass filter was applied on the noise-free EEG data in order to extract the signal content in six different frequency bands corresponding to well-known oscillations of brain activity, namely, (8-13) Hz (α), (13-21) Hz (β1), (21-30) Hz (β2), (30-40) Hz (low γ), (40-70) Hz (medium γ), and (70-120) Hz (high γ). Finally, a sliding window paradigm was applied to the noise-free, band-passed EEG data by setting two temporal parameters: 𝐿, which represents the time length of the analysis window, and 𝑆, which is the time shift of the window which slides on the signal.…”
Section: EEG Preprocessing (mentioning)
confidence: 99%
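
The preprocessing steps quoted above (50 Hz notch, six band-pass filters, sliding windows of length 𝐿 and shift 𝑆) can be sketched roughly as follows with SciPy. The sampling rate, Butterworth order, notch quality factor, and the 2 s / 0.5 s window parameters are assumptions for the example; only the notch frequency, the band edges, and the 𝐿/𝑆 windowing scheme come from the excerpt.

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 512  # sampling rate in Hz -- an assumption for this sketch
BANDS = {"alpha": (8, 13), "beta1": (13, 21), "beta2": (21, 30),
         "low_gamma": (30, 40), "mid_gamma": (40, 70), "high_gamma": (70, 120)}

def preprocess(eeg, fs=FS, win_len_s=2.0, win_shift_s=0.5):
    """Notch-filter line noise, band-pass a single-channel EEG trace into
    the six bands, and segment each band with a sliding window of length
    L = win_len_s and shift S = win_shift_s (both in seconds)."""
    # 50 Hz notch filter removes power-line interference
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
    clean = filtfilt(b, a, eeg)

    # zero-phase Butterworth band-pass for each frequency band
    band_signals = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
        band_signals[name] = filtfilt(b, a, clean)

    # sliding-window segmentation: windows of L samples taken every S samples
    L, S = int(win_len_s * fs), int(win_shift_s * fs)
    return {name: np.stack([sig[i:i + L]
                            for i in range(0, len(sig) - L + 1, S)])
            for name, sig in band_signals.items()}
```
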
“…KNN (the K-nearest neighbor algorithm) is a non-parametric classification method based on distance metrics [14][15][16]. It is a suitable choice for data categorization because of its simple underlying principle and the small number of factors that influence it.…”
Section: Machine Learning Based Methods (mentioning)
confidence: 99%
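
As a library-based counterpart to the hand-rolled sketch earlier, the snippet below uses scikit-learn's KNeighborsClassifier, which implements the same distance-based, non-parametric scheme; the synthetic dataset and k=5 are placeholders, not values taken from the cited works.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# synthetic stand-in data; any numeric feature matrix works the same way
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# non-parametric and distance-based: fitting only stores the training set
clf = KNeighborsClassifier(n_neighbors=5, metric="euclidean")
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```
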
“…However, choosing the K value is difficult owing to a lack of data pre-processing. In general, an odd integer is used for K, since an odd number avoids the voting ambiguity that can arise with an even K [13,14]. Its benefits include simple and efficient computation, low retraining cost, and suitability for automatic class-field categorization with large sample sizes [14].…”
Section: Machine Learning Based Methods (mentioning)
confidence: 99%
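
One common way to act on the advice above, restricting K to odd values and then picking among them, is cross-validation. The sketch below assumes a synthetic dataset, 5-fold splits, and a candidate range of 1-19; none of these choices come from the cited works.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# evaluate only odd K values so a two-class majority vote cannot tie
scores = {k: cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
          for k in range(1, 20, 2)}
best_k = max(scores, key=scores.get)
print("best K:", best_k, "mean CV accuracy:", round(scores[best_k], 3))
```
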
“…The algorithm has the advantages of simplicity, effectiveness, no parameter estimation, and low complexity. However, the standard Euclidean distance can become inaccurate and computationally expensive, because each test sample must be compared against all attributes of every training sample at run time, while those attributes often include unrelated or weakly correlated ones (H. Wang, Xu, & Zhao, 2022).…”
Section: Classification Model (mentioning)
confidence: 99%
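
The weakness described above, where the standard Euclidean distance treats unrelated or weakly correlated attributes the same as informative ones, is often illustrated with attribute weighting. The sketch below is a generic heuristic (absolute label correlation as the weight), not the cited paper's clustering-and-region-division method; the weighting rule and all names are assumptions for illustration.

```python
import numpy as np
from collections import Counter

def correlation_weights(X_train, y_train):
    """Heuristic (not the cited paper's method): weight each attribute by
    the absolute Pearson correlation between that attribute and the class
    label, so weakly related attributes contribute less to the distance."""
    w = np.array([abs(np.corrcoef(X_train[:, j], y_train)[0, 1])
                  for j in range(X_train.shape[1])])
    return np.nan_to_num(w)  # constant columns yield NaN correlations

def weighted_knn_predict(X_train, y_train, X_test, k=5):
    """k-NN using a weighted Euclidean distance instead of the standard one."""
    w = correlation_weights(X_train, y_train)
    preds = []
    for x in X_test:
        d2 = np.sum(w * (X_train - x) ** 2, axis=1)  # weighted distance
        nearest_labels = y_train[np.argsort(d2)[:k]]
        preds.append(Counter(nearest_labels).most_common(1)[0][0])
    return np.array(preds)
```
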