2019 International Conference on Intelligent Computing and Control Systems (ICCS)
DOI: 10.1109/iccs45141.2019.9065563

A Comparative Study for Breast Cancer Prediction using Machine Learning and Feature Selection

Cited by 41 publications (8 citation statements)
References 8 publications
“…NN's versatility, owing to its multiple variations such as those in Zhang et al [29], Mohammed et al [30], and Higa [31], makes it a great choice for hyper-parameterization. Similar to SVM, the LR and NN models showed robustness on BCWD in the studies of Utomo et al [16], Dhanya et al [18], Omondiagbe et al [19], Gupta and Garg [20] and Balaraman [21], making SVM, LR and NN, after rigorous examination, the preferred classification models of the authors.…”
Section: Literature Analysis (mentioning)
confidence: 95%
“…SVM proved to have the highest accuracy (98.1%) and Receiver Operating Characteristic (ROC) curve for both classifications. Dhanya et al [18] showed the differences in the performance of LR, naïve Bayes and random forest. This study also employed various methods of feature selection, such as sequential feature selection, Recursive Feature Elimination (RFE), f-test and correlation.…”
(mentioning)
confidence: 99%
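The quoted study pairs several classifiers (LR, naïve Bayes, random forest) with different feature-selection methods. As a rough illustration only, here is a minimal sketch, assuming scikit-learn and its built-in Wisconsin breast cancer dataset, that compares two of the mentioned selectors (RFE and the f-test) across those three classifiers; it is not the authors' actual pipeline, and the feature counts and hyperparameters are placeholders.

```python
# Sketch only: compares RFE and f-test feature selection across LR, naive Bayes
# and random forest on the Wisconsin breast cancer data (assumes scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE, SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

selectors = {
    "rfe": RFE(LogisticRegression(max_iter=5000), n_features_to_select=10),
    "f_test": SelectKBest(f_classif, k=10),
}
models = {
    "log_reg": LogisticRegression(max_iter=5000),
    "naive_bayes": GaussianNB(),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for s_name, selector in selectors.items():
    for m_name, model in models.items():
        pipe = Pipeline([("scale", StandardScaler()),
                         ("select", selector),
                         ("clf", model)])
        acc = cross_val_score(pipe, X, y, cv=5).mean()  # 5-fold CV accuracy
        print(f"{s_name:>7} + {m_name:<13} accuracy = {acc:.3f}")
```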
“…Sequential forward and backward feature selection [23] and metaheuristic algorithms such as the genetic algorithm [24] and particle swarm optimization [25] have been extensively used as wrapper algorithms for BC prediction. Dhanya et al [26] compared FS and ML algorithms for BC prediction. The experiments were conducted on the WBCD.…”
Section: Literature Survey (mentioning)
confidence: 99%
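Wrapper-style selection, as referenced in the quote above, scores candidate feature subsets by training the classifier itself on them. Below is a minimal sketch, assuming scikit-learn >= 0.24 (which provides SequentialFeatureSelector) and a k-NN base classifier chosen purely for illustration; metaheuristic wrappers such as GA or PSO would need external libraries and are not shown.

```python
# Sketch only: sequential forward and backward selection used as a wrapper
# around a k-NN classifier (assumes scikit-learn >= 0.24).
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
knn = KNeighborsClassifier(n_neighbors=5)

for direction in ("forward", "backward"):
    sfs = SequentialFeatureSelector(knn, n_features_to_select=10,
                                    direction=direction, cv=5)
    sfs.fit(X, y)
    print(direction, "selection kept feature indices:",
          sfs.get_support(indices=True))
```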
“…Greater values of the f-test indicate smaller distances within the groups and greater distances between the groups. In this ANOVA feature selection method using the f-test, the features are ranked according to their f-score [26].…”
Section: B. Feature Selection (mentioning)
confidence: 99%
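As one illustration of the ranking described in the quote above, here is a minimal sketch assuming scikit-learn: f_classif computes the ANOVA F-score (and p-value) for each feature, and the features are then ranked from highest to lowest F-score.

```python
# Sketch only: rank features of the Wisconsin breast cancer data by their
# ANOVA F-score (assumes scikit-learn and NumPy).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import f_classif

data = load_breast_cancer()
f_scores, p_values = f_classif(data.data, data.target)

order = np.argsort(f_scores)[::-1]   # highest F-score first
for i in order[:5]:                  # show the top-5 ranked features
    print(f"{data.feature_names[i]:<25} F = {f_scores[i]:8.1f}  p = {p_values[i]:.2e}")
```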