The Barzilai-Borwein (BB) gradient method, which computes the step size by imposing a certain quasi-Newton property, is more efficient than the classic steepest descent (SD) method, even though it generally produces a nonmonotone sequence of objective values. It has been observed that the SD method with the Yuan stepsize is very competitive with the BB gradient method, since the Yuan stepsize yields finite termination for two-dimensional strictly convex quadratic functions. In this paper, we investigate how to accelerate the BB gradient method by imposing this two-dimensional quadratic termination property. We introduce a novel equation and show that, in the two-dimensional quadratic case, it is equivalent to requiring that the gradient at the next iteration aligns with some eigendirection of the Hessian, which yields finite termination. Moreover, the Yuan stepsize can be recovered as a special solution of the equation. Based on the proposed equation, we derive a new stepsize for the BB gradient method. In addition to the aforementioned two-dimensional quadratic termination property, a remarkable feature of the new stepsize is that its computation depends only on the long and short BB stepsizes in two consecutive iterations, without requiring exact line searches or the Hessian. It thus has the great advantage of being easily extended