2020 Intermountain Engineering, Technology and Computing (IETC)
DOI: 10.1109/ietc47856.2020.9249117
Chi-Squared Based Feature Selection for Stroke Prediction using AzureML

Cited by 23 publications (16 citation statements); references 1 publication.
“…In terms of evaluation metrics, most studies used common evaluation metrics for risk prediction models, with C-statistics or area under the ROC curve, sensitivity, specificity, and accuracy, while fewer reported precision and F-score. 21,22,26,27,32,34,36,39,41,42 A summary of the evaluation metrics is provided in Supplemental Table 5. The boosting algorithm provided the best overall median C-statistic of 0.92 (IQR: 0.90-0.95), followed by SVM [median C-statistic = 0.85 (IQR: 0.74-0.94)] and NN [median C-statistic = 0.78 (IQR: 0.75-0.91)].…”
Section: Evaluation Methods (mentioning)
confidence: 99%
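The metrics named in the excerpt above (C-statistic/ROC AUC, sensitivity, specificity, accuracy, precision, F-score) can all be computed with scikit-learn. A minimal sketch follows; the labels and scores are illustrative toy values, not data from any of the cited studies:

```python
# Sketch: computing the evaluation metrics listed above with scikit-learn.
# y_true / y_score are illustrative, not from the reviewed studies.
from sklearn.metrics import (roc_auc_score, confusion_matrix,
                             accuracy_score, precision_score, f1_score)

y_true  = [0, 0, 1, 1, 1, 0, 1, 0]                    # ground-truth labels
y_score = [0.1, 0.6, 0.8, 0.9, 0.6, 0.3, 0.7, 0.2]    # model probabilities
y_pred  = [1 if s >= 0.5 else 0 for s in y_score]     # threshold at 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # recall on the positive (event) class
specificity = tn / (tn + fp)

print("C-statistic:", roc_auc_score(y_true, y_score))
print("accuracy:   ", accuracy_score(y_true, y_pred))
print("precision:  ", precision_score(y_true, y_pred))
print("F-score:    ", f1_score(y_true, y_pred))
print("sensitivity:", sensitivity, "specificity:", specificity)
```

The C-statistic is computed from the raw probabilities, while the remaining metrics depend on the chosen classification threshold.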
“…As a result, the number of features is selected based on the highest chi-squared scores. The chi-squared formula is presented below [16]:…”
Section: Chi-squared (mentioning)
confidence: 99%
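The formula itself is elided in the excerpt; the standard chi-squared statistic it refers to is χ² = Σ (O − E)² / E over observed and expected class frequencies. A minimal sketch of selecting features by highest chi-squared score with scikit-learn follows; the feature matrix is a toy example, not the stroke dataset:

```python
# Sketch: chi-squared feature scoring and selection with scikit-learn.
# X and y are illustrative; chi2 requires non-negative feature values.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

X = np.array([[1, 0, 3],
              [2, 1, 0],
              [3, 0, 1],
              [4, 1, 2]])          # non-negative counts/encodings
y = np.array([0, 1, 0, 1])

# Keep the k features with the highest chi-squared scores
selector = SelectKBest(score_func=chi2, k=2)
X_top = selector.fit_transform(X, y)

print("scores per feature:", selector.scores_)
print("selected indices:  ", selector.get_support(indices=True))
```

Each feature's score sums (observed − expected)² / expected across classes, where the expected counts assume independence between the feature and the class label; higher scores indicate stronger dependence.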
“…The chi-squared test was proposed [22] for feature selection to extract the top six features to train two ML models, namely a two-class decision jungle model and a two-class boosted decision tree model. These two ML models were trained once with all features and once with only the top six features.…”
Section: Related Work (mentioning)
confidence: 99%
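The comparison described above, training a boosted decision tree on all features and again on only the top six chi-squared features, can be sketched outside AzureML. In the sketch below, `GradientBoostingClassifier` stands in for AzureML's two-class boosted decision tree (scikit-learn has no direct decision-jungle analogue), and the dataset is synthetic, not the stroke data:

```python
# Sketch: boosted decision tree trained on all features vs. the top six
# chi-squared features. Synthetic data; GradientBoostingClassifier is a
# stand-in for AzureML's two-class boosted decision tree.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=12,
                           n_informative=6, random_state=0)
X = X - X.min(axis=0)            # shift: chi2 needs non-negative features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: all 12 features
full = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Top six features by chi-squared score (fit on training data only,
# to avoid leaking test information into the selection)
selector = SelectKBest(chi2, k=6).fit(X_tr, y_tr)
top6 = GradientBoostingClassifier(random_state=0).fit(
    selector.transform(X_tr), y_tr)

print("all features  :", full.score(X_te, y_te))
print("top-6 features:", top6.score(X_te, y_te, ) if False else
      top6.score(selector.transform(X_te), y_te))
```

Note that the reduced model must be evaluated on the same six columns it was trained on, hence `selector.transform(X_te)` at evaluation time.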