In this paper we propose efficient new linesearch algorithms for solving large-scale unconstrained optimization problems which exploit any local nonconvexity of the objective function. Current algorithms in this class typically compute a pair of search directions at every iteration: a Newton-type direction, which ensures both global and fast asymptotic convergence, and a negative-curvature direction, which enables the iterates to escape from any region of local nonconvexity. A new point is generated by performing a search along a line or a curve obtained by combining these two directions. However, in almost all of these algorithms, the relative scaling of the directions is not taken into account. We propose a new algorithm which accounts for the relative scaling of the two directions. To do this, only the more promising of the two directions is selected at any given iteration, and a linesearch is performed along the chosen direction. The appropriate direction is selected by estimating the rate of decrease of the quadratic model of the objective function along both candidate directions. We prove global convergence to second-order critical points for the new algorithm, and report some preliminary numerical results.
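The selection idea described above can be illustrated with a minimal sketch. Here we assume a Newton-type direction `s` and a negative-curvature direction `d` are already available, and we compare the decrease each produces in the quadratic model q(p) = gᵀp + ½ pᵀHp; the specific comparison rule below (evaluating the model at unit steps along each direction) is an illustrative assumption, not necessarily the exact criterion used in the paper.

```python
import numpy as np

def choose_direction(g, H, s, d):
    """Pick the more promising of a Newton-type direction s and a
    negative-curvature direction d by comparing the reduction each
    yields in the quadratic model q(p) = g@p + 0.5*p@H@p.

    Illustrative criterion only: the model is evaluated at a unit
    step along each direction; the paper's actual rule may estimate
    the rate of decrease differently.
    """
    q = lambda p: g @ p + 0.5 * p @ H @ p
    # The direction giving the larger predicted decrease (more
    # negative model value) is selected for the linesearch.
    return s if q(s) <= q(d) else d

# Small 2-D example: H is indefinite, so a negative-curvature
# direction exists alongside the Newton-type direction.
g = np.array([1.0, 0.1])
H = np.diag([2.0, -1.0])
s = np.array([-0.5, 0.0])   # Newton-type step in the convex subspace
d = np.array([0.0, -1.0])   # negative curvature: d @ H @ d < 0, g @ d <= 0
p = choose_direction(g, H, s, d)
```

In this example the negative-curvature direction predicts the larger model decrease (q(d) = -0.6 versus q(s) = -0.25), so `d` would be chosen for the linesearch.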