2020
DOI: 10.1007/978-3-030-48791-1_26

Feature Selection with Artificial Bee Colony Algorithms for Classifying Parkinson’s Diseases

Cited by 5 publications (5 citation statements)
References 28 publications
Citation types: 0 supporting, 5 mentioning, 0 contrasting
“…Many studies have used publicly available datasets from different repositories for PD diagnosis, employing NI algorithms for feature selection and ML algorithms for classification. In this research study, the dataset was collected from the UCI ML Repository [65], which was also used in some previous studies mentioned in the literature section [43], [63]. The dataset was prepared from 252 participants whose speech was recorded at the Department of Neurology at Istanbul University; 188 (107 men and 81 women) were PD patients aged 33 to 87, and 64 (23 men and 41 women) were healthy individuals aged 41 to 82.…”
Section: Methodology, A. Dataset (mentioning)
confidence: 99%
“…In addition to OCFA, this study included MGWO for performance comparison, where the proposed MGWO selected the fewest features with the highest accuracy. Later in the same year, Durgut et al. [63] used binary versions of the artificial bee colony algorithm to remove irrelevant features, deploying KNN for classification and SVM in a second stage to compare individual results. Dash et al. [64] employed a chaotic firefly algorithm integrated with a kernel-based NB algorithm to find discriminant features, and five classifiers were used for classification.…”
(mentioning)
confidence: 99%
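The binary-ABC wrapper idea referenced in this statement (score a candidate feature subset with a classifier, then perturb the subset) can be summarised in a short sketch. This is a minimal illustration, not Durgut et al.'s exact algorithm: the employed/onlooker/scout bee phases are collapsed into a single greedy bit-flip search, and load_breast_cancer is a stand-in dataset.

```python
# Minimal wrapper-style feature-selection sketch: a binary mask is scored by
# KNN cross-validation accuracy, and a simple bit-flip search improves it.
# Illustrative of the binary-ABC idea only; no bee phases are modelled here.
import numpy as np
from sklearn.datasets import load_breast_cancer  # stand-in dataset (assumption)
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
X, y = load_breast_cancer(return_X_y=True)

def fitness(mask):
    """Mean 5-fold CV accuracy of KNN on the selected feature subset."""
    if not mask.any():
        return 0.0  # an empty subset is invalid
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X[:, mask], y, cv=5).mean()

# Start from a random mask and accept improving single-bit flips, mimicking
# the neighbourhood move and greedy trial comparison of a binary ABC source.
mask = rng.random(X.shape[1]) < 0.5
best = fitness(mask)
for _ in range(200):
    j = rng.integers(X.shape[1])
    trial = mask.copy()
    trial[j] = ~trial[j]
    score = fitness(trial)
    if score > best:
        mask, best = trial, score

print(f"selected {mask.sum()} of {X.shape[1]} features, CV accuracy {best:.3f}")
```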
“…Attempts to characterise such search spaces face the increasing computational complexity of most learning algorithms, for which the number of input features and the sample size are critical parameters. To reduce space and computational complexity, the number of features of a given problem should be reduced (Durgut et al. 2020). Many predictors benefit from the feature-selection process since it reduces overfitting and improves accuracy, among other things (Chandrashekar and Sahin 2014).…”
Section: Related Work (mentioning)
confidence: 99%
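As a concrete illustration of that dimensionality-reduction argument, a quick filter-style baseline can be sketched as below; SelectKBest with the ANOVA F-score and the stand-in dataset are illustrative assumptions, not a method prescribed by the cited works.

```python
# Filter-style baseline: keep the k features with the highest ANOVA F-score,
# then compare KNN accuracy on the full vs. reduced feature set.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset (assumption)
knn = KNeighborsClassifier(n_neighbors=5)

full = cross_val_score(knn, X, y, cv=5).mean()
X_red = SelectKBest(f_classif, k=10).fit_transform(X, y)
reduced = cross_val_score(knn, X_red, y, cv=5).mean()

print(f"all {X.shape[1]} features: {full:.3f}; top-10 features: {reduced:.3f}")
```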
“…As an alternative to the Euclidean function, the Manhattan, Minkowski and Hamming functions can also be used. After the distances are calculated, they are sorted, and the incoming sample is assigned to the appropriate class [29].…”
Section: K-Nearest Neighbor (mentioning)
confidence: 99%
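A minimal sketch of the procedure this quote describes, assuming scikit-learn's KNeighborsClassifier, whose metric parameter accepts the alternative distance functions named above (the dataset is a stand-in; Hamming is omitted since it suits binary-coded features rather than continuous ones):

```python
# KNN as described above: compute distances to the training points, sort them,
# and assign the incoming sample to the majority class of the k nearest.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset (assumption)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Swap in any of the distance functions mentioned in the quote.
for metric in ("euclidean", "manhattan", "minkowski"):
    knn = KNeighborsClassifier(n_neighbors=5, metric=metric)
    knn.fit(X_tr, y_tr)
    print(metric, f"{knn.score(X_te, y_te):.3f}")
```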