The conjugate gradient method is a useful tool for solving large-scale unconstrained optimisation problems and has applications in several fields, such as engineering, medical science, image restoration, and neural networks. Its main advantage over methods such as Newton's method and its quasi-Newton variants is that it requires neither the second derivative nor an approximation of it. Moreover, the conjugate gradient algorithm is simple and easy to implement. This study proposes a new modified conjugate gradient method whose search direction contains four terms, built on popular two- and three-term conjugate gradient methods. The new algorithm satisfies the descent condition and possesses the global convergence property. In the numerical results section, we compare the new algorithm with well-known methods such as CG-Descent. The numerical results show that the new algorithm is more efficient than other popular CG methods, such as CG-Descent 6.8, in terms of the number of function evaluations, the number of gradient evaluations, the number of iterations, and CPU time.
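For readers unfamiliar with the general framework, the following is a minimal sketch of a classical two-term nonlinear conjugate gradient iteration. It uses the Fletcher–Reeves coefficient and a backtracking (Armijo) line search purely for illustration; these choices, the function name, and the line-search parameters are assumptions, not the paper's method. The four-term method proposed in this study modifies the direction update d = -g + beta*d shown below, and its exact form is not given in this abstract.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic Fletcher-Reeves nonlinear CG with Armijo backtracking.

    Illustrative sketch only: the paper's proposed method replaces the
    two-term direction update below with a four-term one.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: restart with steepest descent if d is not a descent direction
        if g @ d >= 0:
            d = -g
        # Backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        gd = g @ d
        for _ in range(50):
            if f(x + alpha * d) <= f(x) + c * alpha * gd:
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient
        beta = (g_new @ g_new) / (g @ g)
        # Two-term direction update: d = -g + beta * d
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage example: minimise the Rosenbrock function
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(conjugate_gradient(f, grad, np.array([-1.2, 1.0])))
```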