The reliable operation of power transformers is essential for grid stability, yet existing fault detection methods often suffer from inaccuracies and high false alarm rates. This study introduces a machine learning framework that leverages voltage signals for early fault detection. We simulated diverse fault conditions (single line-to-ground, line-to-line, turn-to-ground, and turn-to-turn faults) on a laboratory-scale three-phase transformer and evaluated decision tree, support vector machine, and logistic regression models on a dataset of 6000 samples. Decision trees proved the most effective, achieving 99.90% accuracy under 5-fold cross-validation and 95% accuracy on a held-out test set of 400 unseen samples. Notably, the framework yielded a false alarm rate of only 0.47% on an additional 6000-sample healthy-condition dataset. These results demonstrate the efficacy of voltage-based machine learning for transformer diagnostics and highlight its potential as a cost-effective, robust, and scalable alternative to traditional methods for improving transformer fault detection and grid reliability.
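The following is a minimal sketch of the model-comparison protocol described above: 5-fold cross-validation of the three classifiers, accuracy on a held-out test set, and a false alarm rate computed on healthy samples. The use of scikit-learn, the synthetic stand-in features from `make_classification`, the feature dimensionality, and the default hyperparameters are all assumptions for illustration, not details taken from the study.

```python
# Hypothetical evaluation pipeline; assumes scikit-learn and synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Placeholder voltage-signal features labeled healthy (0) vs. faulty (1);
# the real study uses measured samples, not synthetic ones.
X, y = make_classification(n_samples=6400, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=400, stratify=y, random_state=0
)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    # 5-fold cross-validation on the training portion, as in the study.
    cv_acc = cross_val_score(model, X_train, y_train, cv=5).mean()
    # Fit on the full training set, then score on the held-out test set.
    test_acc = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: CV accuracy={cv_acc:.4f}, test accuracy={test_acc:.4f}")

# False alarm rate: fraction of healthy samples the model flags as faulty.
best = models["decision_tree"]
healthy = X_test[y_test == 0]
false_alarm_rate = best.predict(healthy).mean()
print(f"false alarm rate on healthy samples: {false_alarm_rate:.4%}")
```

On real data, the healthy-only false-alarm evaluation would use the separate 6000-sample healthy-condition dataset rather than the healthy slice of the test split used here.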