The efficacy of feed-forward multi-layer neural networks relies heavily on their training procedure, in which identifying appropriate weights and biases plays a pivotal role. However, conventional training algorithms such as backpropagation suffer from limitations, including entrapment in sub-optimal solutions. To address these shortcomings, metaheuristic population-based algorithms have been advocated as a dependable alternative. In this paper, we introduce a novel training methodology, termed DDE-OP, which leverages the principles of differential evolution enriched with a division-based scheme and an opposite-direction strategy. Our approach integrates two effective concepts into differential evolution. First, the proposed algorithm identifies partitions of the search space via a clustering algorithm and designates the resulting cluster centres as representatives. Next, an updating scheme incorporates these centres into the current population. Finally, a quasi-opposite-direction strategy is employed to enhance exploration of the search space. Extensive evaluation on diverse classification and approximation tasks demonstrates that DDE-OP surpasses both conventional and population-based training methodologies.
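The abstract does not give implementation details, so the following is only a minimal sketch of one DDE-OP-style generation under stated assumptions: DE/rand/1/bin as the base differential evolution variant, a simple k-means routine for the division scheme (its centres replacing the worst individuals), and the standard quasi-opposite point definition (a uniform sample between the interval centre and the opposite point `low + high - x`). All function names and parameter values here are illustrative, not the authors' actual code.

```python
import numpy as np

def quasi_opposite(pop, low, high, rng):
    # Quasi-opposite point: uniform sample between the interval centre
    # c = (low + high) / 2 and the opposite point o = low + high - x.
    centre = (low + high) / 2.0
    opposite = low + high - pop
    lo = np.minimum(centre, opposite)
    hi = np.maximum(centre, opposite)
    return rng.uniform(lo, hi)

def kmeans_centres(pop, k, rng, iters=10):
    # Minimal k-means to obtain k cluster centres of the population.
    centres = pop[rng.choice(len(pop), k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(pop[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = pop[labels == j]
            if len(members):
                centres[j] = members.mean(axis=0)
    return centres

def dde_op_generation(pop, fitness, f_obj, low, high,
                      k=3, F=0.5, CR=0.9, rng=None):
    # One generation of the sketched DDE-OP loop (minimisation).
    if rng is None:
        rng = np.random.default_rng(0)
    n, d = pop.shape
    pop, fitness = pop.copy(), fitness.copy()
    # 1) Division scheme: cluster centres replace the k worst individuals.
    centres = kmeans_centres(pop, k, rng)
    worst = np.argsort(fitness)[-k:]
    pop[worst] = centres
    fitness[worst] = np.apply_along_axis(f_obj, 1, centres)
    # 2) Standard DE/rand/1/bin: mutation, crossover, greedy selection.
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), low, high)
        mask = rng.random(d) < CR
        mask[rng.integers(d)] = True  # guarantee at least one mutated gene
        trial = np.where(mask, mutant, pop[i])
        f_trial = f_obj(trial)
        if f_trial < fitness[i]:
            pop[i], fitness[i] = trial, f_trial
    # 3) Quasi-opposition: keep the fitter of each individual and its
    #    quasi-opposite counterpart.
    qpop = quasi_opposite(pop, low, high, rng)
    qfit = np.apply_along_axis(f_obj, 1, qpop)
    better = qfit < fitness
    pop[better], fitness[better] = qpop[better], qfit[better]
    return pop, fitness
```

In a neural-network training setting, each row of `pop` would encode a flattened weight-and-bias vector and `f_obj` would return the network's training loss; the sphere function is used here only as a stand-in objective.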