The feedforward neural network (FNN) has emerged as a promising machine learning model widely used for crucial tasks such as prediction, classification, clustering, and regression due to its excellent performance in training, learning, and organizing data. Conventional approaches such as gradient-based algorithms are commonly used to train FNNs, but they suffer from drawbacks such as easy entrapment in local optima, slow convergence rates, and high sensitivity to the initial solutions generated. Metaheuristic search algorithms such as teaching-learning-based optimization (TLBO) are envisioned as potential alternatives for training FNNs due to their stochastic nature and excellent global search ability. In this paper, a new TLBO variant named teaching-learning-based optimization with modified learning phases (TLBO-MLPs) is designed as the training algorithm of the FNN, aiming to optimize its neuron weights, biases, and selection of activation functions with respect to the datasets of given classification problems. The classification performance of the FNN trained by TLBO-MLPs is evaluated using real-life classification datasets from the UCI Machine Learning Repository and compared with that of FNNs optimized by other well-established TLBO variants. Rigorous simulation studies show that the FNN trained by TLBO-MLPs significantly outperforms most of its peer algorithms in solving different classification problems. In particular, the FNN classifiers optimized by the proposed TLBO-MLPs are reported to outperform their competitors by up to 15.72% and 29.07% in classification accuracy on the training and testing datasets, respectively.