We consider the iterative computation of negative curvature directions in large-scale unconstrained optimization, where such directions are needed to ensure convergence toward stationary points satisfying second-order necessary optimality conditions. We show that, for this purpose, the conjugate gradient (CG) method can be fruitfully coupled with a recently introduced approach based on the numeral grossone. In particular, recalling that the CG method is in principle well posed only for positive definite linear systems, our proposal exploits grossone to extend the CG method so that negative curvature directions can be computed in the indefinite case as well. Our overall method can be used to significantly generalize the theory in the state-of-the-art literature. Moreover, it straightforwardly allows the solution of Newton's equation in optimization frameworks, even for nonconvex problems. We remark that our iterative procedure for computing a negative curvature direction does not require the storage of any matrix: only a pair of vectors needs to be stored. This represents a clear advance with respect to current results in the literature.
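The grossone-based recursion itself is not reproduced here; as a point of reference, the following minimal sketch (not the authors' method) illustrates the standard CG behavior the proposal builds on: the matrix is accessed only through matrix-vector products, only a few vectors are stored, and when a search direction p satisfies p^T A p <= 0 the system matrix is revealed to be indefinite and p itself certifies nonpositive curvature. Function names and tolerances are illustrative.

```python
import numpy as np

def cg_with_curvature_test(matvec, b, tol=1e-8, max_iter=200):
    """Plain CG on A x = b, with A accessed only via matvec (matrix-free).
    Returns (x, d): d is None if A appears positive definite along the
    explored directions; otherwise d is a direction with d^T A d <= 0."""
    x = np.zeros_like(b)
    r = b.copy()            # residual b - A x (since x = 0 initially)
    p = r.copy()            # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        curv = p @ Ap       # curvature of the quadratic model along p
        if curv <= 0:       # indefinite case: classical CG breaks down here
            return x, p     # p is a direction of nonpositive curvature
        alpha = rs / curv
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return x, None  # system solved; no negative curvature detected
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, None

# Usage on a small indefinite matrix (one negative eigenvalue):
A = np.diag([4.0, 1.0, -2.0])
b = np.ones(3)
x, d = cg_with_curvature_test(lambda v: A @ v, b)
if d is not None:
    print("negative curvature direction:", d, " d^T A d =", d @ A @ d)
```

In this sketch the classical CG simply stops at the breakdown; the grossone-based proposal is designed precisely to continue past this point while still returning a negative curvature direction, without ever storing the matrix.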