The adoption of electronic patient health records has paved the way for machine learning and deep learning in disease diagnostics and prediction. Although tree-based algorithms have traditionally performed well on structured tabular data, neural networks are known to perform well on unstructured data and on data with a large number of input features. Furthermore, transformer-based models such as TabTransformer have been shown to perform competitively with tree-based algorithms (Huang et al. 2020). In this paper, we compare TabTransformer’s performance with other state-of-the-art machine learning algorithms such as XGBoost, Random Forest, Decision Tree, and the feed-forward Multilayer Perceptron (MLP). We find that TabTransformer shows no significant improvement over the MLP and performs worse on certain metrics. Neither TabTransformer nor the MLP outperformed XGBoost, the best-performing algorithm for brain stroke prediction in Kaggle competitions.
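To illustrate the kind of comparison described above, the following is a minimal sketch (not the paper's actual experimental code) that fits a tree-based model and a feed-forward network on a tabular dataset and reports ROC AUC. The file name `stroke.csv`, the target column `stroke`, and all hyperparameters are assumptions for demonstration only.

```python
# Illustrative sketch: comparing a gradient-boosted tree model with an MLP on tabular data.
# Dataset path, column names, and hyperparameters are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.neural_network import MLPClassifier
from xgboost import XGBClassifier

df = pd.read_csv("stroke.csv")                       # hypothetical dataset file
X = pd.get_dummies(df.drop(columns=["stroke"]))      # one-hot encode categorical features
y = df["stroke"]
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

models = {
    "XGBoost": XGBClassifier(n_estimators=300, max_depth=6, eval_metric="logloss"),
    "MLP": MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: ROC AUC = {auc:.3f}")
```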