In this paper, building on a new class of conjugate gradient methods proposed by Rivaie, Dai, Omer, and others, we propose a class of improved conjugate gradient methods for nonconvex unconstrained optimization. Unlike those methods, ours possess the following properties: (i) the search direction always satisfies the sufficient descent condition, independently of the line search; (ii) the methods are globally convergent under the standard Wolfe line search or the standard Armijo line search without any convexity assumption. Moreover, numerical results demonstrate the efficiency of the proposed methods.
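For reference, here is a brief statement of the two standard conditions invoked above, under the usual notation (assumed here, since the abstract fixes none): $f$ is the objective, $g_k = \nabla f(x_k)$, $d_k$ is the search direction, and $\alpha_k > 0$ is the step size. The sufficient descent condition requires, for some constant $c > 0$,
\[
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k,
\]
and the standard Wolfe line search chooses $\alpha_k$ satisfying, with parameters $0 < \delta < \sigma < 1$,
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k,
\qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k.
\]
The standard Armijo line search enforces only the first of these two inequalities, selecting $\alpha_k$ by backtracking.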