In this study, a hybrid algorithm combining differential evolution (DE) and back-propagation (BP), referred to as the DE-BP algorithm, is proposed to optimize the weights of a feed-forward neural network (FNN). The hybrid algorithm achieves faster convergence with higher accuracy by exploiting the global search ability of the DE evolutionary algorithm and the strong local search ability of the BP algorithm. DE explores the search space effectively during the initial stage of a global search, but at the expense of convergence speed; conversely, the gradient-based BP algorithm converges quickly, but random initialization of its weights may cause it to become stuck in local minima. In the proposed hybrid algorithm, the global search ability of DE is first used for a few generations to select good starting weights and move toward the globally optimal region of the search space; the precise local gradient search of BP then takes over in that region to converge to the optimal solution at an increased rate. The performance of the proposed DE-BP algorithm is evaluated on a couple of public-domain datasets, and the experimental results are compared with the BP algorithm, the DE evolutionary training algorithm, and a hybrid real-coded genetic algorithm with back-propagation (GA-BP). The results show that the proposed hybrid DE-BP algorithm produces promising results in comparison with the other training algorithms.
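The two-phase scheme described above can be sketched in a minimal form: DE evolves a population of flattened weight vectors for a few generations, and the best individual then seeds plain gradient-descent back-propagation. This is an illustrative toy (a single-hidden-layer network on the XOR task); the network size, DE hyperparameters (population size, F, CR, generation count), and dataset are assumptions for demonstration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR (illustrative stand-in for the paper's benchmark datasets)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

H = 4                            # hidden units (assumed)
DIM = 2 * H + H + H + 1          # w1 (2xH) + b1 (H) + w2 (Hx1) + b2 (1)

def unpack(v):
    """Split a flat weight vector into layer matrices/biases."""
    i = 0
    w1 = v[i:i + 2 * H].reshape(2, H); i += 2 * H
    b1 = v[i:i + H];                   i += H
    w2 = v[i:i + H].reshape(H, 1);     i += H
    b2 = v[i:i + 1]
    return w1, b1, w2, b2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60, 60)))  # clip avoids overflow

def mse(v):
    """Mean-squared error of the network encoded by flat vector v."""
    w1, b1, w2, b2 = unpack(v)
    out = sigmoid(sigmoid(X @ w1 + b1) @ w2 + b2)
    return float(np.mean((out - y) ** 2))

def de_search(pop_size=30, gens=40, F=0.7, CR=0.9):
    """Phase 1: classic DE/rand/1/bin over flattened weight vectors."""
    pop = rng.uniform(-1, 1, (pop_size, DIM))
    fit = np.array([mse(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)
            cross = rng.random(DIM) < CR
            cross[rng.integers(DIM)] = True          # at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f = mse(trial)
            if f <= fit[i]:                          # greedy selection
                pop[i], fit[i] = trial, f
    best = int(np.argmin(fit))
    return pop[best].copy(), float(fit[best])

def bp_refine(v, lr=0.5, epochs=2000):
    """Phase 2: full-batch gradient-descent BP from the DE solution."""
    v = v.copy()
    for _ in range(epochs):
        w1, b1, w2, b2 = unpack(v)
        h = sigmoid(X @ w1 + b1)
        out = sigmoid(h @ w2 + b2)
        d_out = (out - y) * out * (1 - out) * (2.0 / len(X))
        g_w2 = h.T @ d_out
        g_b2 = d_out.sum(axis=0)
        d_h = (d_out @ w2.T) * h * (1 - h)
        g_w1 = X.T @ d_h
        g_b1 = d_h.sum(axis=0)
        v -= lr * np.concatenate([g_w1.ravel(), g_b1, g_w2.ravel(), g_b2])
    return v

best, de_err = de_search()      # coarse global search
refined = bp_refine(best)       # precise local refinement
# mse(refined) <= de_err: BP polishes the DE solution
```

The design choice this sketch highlights is the hand-off: DE never needs gradients, so it tolerates the rugged early landscape, while BP's gradient step is only applied once the population has located a promising basin.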