The global minimum search problem is important in neural networks because the error cost forms a multi-minima potential in the weight parameter space; the parameters that attain the global minimum of the cost function are therefore the best values for enhancing a network's performance. A global minimum search based on a damped oscillator equation, known as the heavy ball with friction (HBF), has been studied previously: the ball overcomes a local minimum if its kinetic energy is sufficiently large, and otherwise converges into that local minimum under the action of friction. However, no appropriate damping coefficient has been found for the HBF, so the ball must be re-launched from each local minimum it reaches until it finds the global minimum. To solve this problem, we determined an adaptive damping coefficient using the geodesic of the Newtonian dynamics Lagrangian. This geometric method produces a second-order, adaptively damped oscillator equation whose damping coefficient is the negative time derivative of the logarithm of the cost potential. By discretizing this second-order equation, we obtained a novel adaptive steepest descent. To investigate its performance, we applied this first-order update rule to Rosenbrock- and Griewank-type potentials; the method found the global minimum in most cases from various initial points. Our adaptive steepest descent may be applied in
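As a rough illustration of the kind of first-order rule described above, the following sketch uses a cost-dependent step length derived from the overdamped limit of an oscillator with damping coefficient gamma = -(d/dt) ln f. The step length scales with sqrt(f), so it stays finite at a local minimum (where f > 0) but vanishes at a global minimum (where f = 0). The step parameter `eta`, the sqrt(f) scaling, and the particular discretization are assumptions for illustration, not the paper's exact scheme; the Rosenbrock function is the standard test potential mentioned in the text.

```python
import numpy as np

def rosenbrock(x):
    """Classic two-dimensional Rosenbrock potential; global minimum f = 0 at (1, 1)."""
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    """Analytic gradient of the Rosenbrock potential."""
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

def adaptive_steepest_descent(f, grad, x0, eta=0.01, steps=20000, eps=1e-12):
    """Illustrative adaptive steepest descent (hypothetical discretization).

    Follows the continuous dynamics x' = -sqrt(f) * grad f / |grad f|,
    obtained from the overdamped limit of an oscillator whose damping
    coefficient is gamma = -(d/dt) ln f.  Because the step length is
    eta * sqrt(f) rather than eta * |grad f|, the iterate keeps moving
    at local minima (f > 0) and stops only where the cost vanishes.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        fx = f(x)
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if fx < eps or gnorm < eps:
            break  # at (or numerically indistinguishable from) a global minimum
        x = x - eta * np.sqrt(fx) * g / gnorm
    return x

x0 = np.array([-1.5, 2.0])
x_final = adaptive_steepest_descent(rosenbrock, rosenbrock_grad, x0)
```

On the Rosenbrock potential this rule descends into the curved valley and then creeps along it, with the step length shrinking only as the cost itself approaches zero.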