Proposing new strategies to improve the optimization performance of differential evolution (DE) is an important research topic. The jSO algorithm was the announced winner of the Congress on Evolutionary Computation (CEC) 2017 competition on numerical optimization and is the state-of-the-art algorithm in the SHADE (Success-History based Adaptive Differential Evolution) series. However, jSO converges prematurely on search spaces of different dimensions, is prone to becoming trapped in local optima during evolution, and suffers from decreasing population diversity. In this paper, a modified jSO algorithm (MjSO) is proposed that addresses these problems through cosine-similarity-based parameter adaptation and a novel opposition-based learning restart mechanism that incorporates symmetry. Moreover, since parameter settings are well known to have a significant impact on algorithm performance and the search process can be divided into two symmetric phases, a parameter control strategy based on the symmetric search process is introduced into MjSO. The effectiveness of these designs is supported by a population clustering analysis together with a population diversity measure. To evaluate the performance of the proposed algorithm, it is compared with three state-of-the-art DE variants (EBLSHADE, ELSHADE-SPACMA, and SALSHADE-cnEPSin) and two original algorithms (jSO and LSHADE) on 30 CEC'17 benchmark functions and three classical engineering design problems. The experimental results and analysis reveal that the proposed algorithm outperforms the other competitors in terms of convergence speed and solution quality. Promisingly, the proposed method can serve as an effective and efficient auxiliary tool for more complex optimization models and scenarios.
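For reference, the two building blocks named above rest on standard textbook definitions; the sketch below uses generic notation (individuals $\mathbf{x}, \mathbf{y} \in \mathbb{R}^{D}$ and per-dimension search bounds $[a_j, b_j]$) for illustration only and is not necessarily the paper's exact formulation:
\[
\operatorname{sim}(\mathbf{x}, \mathbf{y}) \;=\; \frac{\mathbf{x} \cdot \mathbf{y}}{\lVert \mathbf{x} \rVert \, \lVert \mathbf{y} \rVert},
\qquad
\breve{x}_{j} \;=\; a_{j} + b_{j} - x_{j}, \quad j = 1, \dots, D,
\]
where the first expression is the cosine similarity between two individuals and the second generates the opposite point of $\mathbf{x}$ within the bounds, as used in opposition-based learning.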