“…Feature selection reduces the number of input variables, which lowers the computational cost of modeling and mitigates overfitting. Previous studies performed feature selection through a combination of clinical and statistical judgment: clinical features were first identified by neurologists with clinical expertise or drawn from related studies, and some studies then applied feature engineering to transform the raw data (we explore feature engineering in detail in Section 6). Statistical methods were subsequently applied to further select statistically significant features from among the initially selected features and the new features generated by feature engineering; these methods included stepwise model building (19, 25, 27, 29, 34, 39), univariate analysis (17, 20, 28, 30, 33, 38, 43, 48), multivariable analysis using logistic regression (16, 21, 24, 26, 31, 32), plots displaying the pattern of predictors and outcome (21), and the Least Absolute Shrinkage and Selection Operator (LASSO) (25, 40).…”
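
To make the LASSO-based step concrete, the following is a minimal sketch of embedded feature selection with an L1-penalised logistic regression in scikit-learn. The dataset, penalty strength, and feature counts are illustrative, not taken from the cited studies:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Synthetic clinical-style data: 200 patients, 20 candidate features,
# only 5 of which are informative (all numbers are illustrative).
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           n_redundant=2, random_state=0)
X = StandardScaler().fit_transform(X)  # L1 penalties assume comparable scales

# L1-penalised logistic regression (LASSO-style) shrinks the coefficients
# of uninformative features to exactly zero, performing feature selection.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1,
                           random_state=0).fit(X, y)
selector = SelectFromModel(lasso, prefit=True)
selected = np.flatnonzero(selector.get_support())
print("selected feature indices:", selected)
```

In practice, the regularisation strength (`C` here, the inverse of the LASSO penalty) is usually chosen by cross-validation rather than fixed as in this sketch.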