We derive a direct connection between Nesterov's accelerated first-order algorithm and the Halpern fixed-point iteration scheme for approximating a solution of a co-coercive equation. We show that one method is equivalent to the other via a simple transformation, leading to a short convergence proof for Nesterov's accelerated scheme. Alternatively, we directly establish convergence rates for Nesterov's accelerated variant and, as a consequence, obtain a new convergence rate for the Halpern fixed-point iteration. Next, we apply our results to several methods for solving monotone inclusions, including the proximal-point, forward-backward splitting, and three-operator splitting methods, where our convergence guarantees apply. Since the forward scheme requires the underlying operator to be co-coercive, we derive new Nesterov-type accelerated variants of both the recent extra-anchored gradient and past-extra-anchored gradient methods, which avoid the co-coercivity assumption. For the extra-anchored gradient method, we consider two variants: one studied in [43] for the monotone case and one in [21] for the co-monotone case; the latter covers the former as a special case and has a tighter convergence rate. For the past-extra-anchored gradient method (an accelerated variant of the optimistic gradient method) [41], we obtain a new Nesterov-type accelerated variant with two past-iterate correction terms, which may help to develop new accelerated methods for minimax problems and their continuous-time views. We test our theoretical results on two numerical examples, where the observed convergence rates match the theoretical rates up to a constant factor.
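For concreteness, the classical Halpern iteration underlying this connection can be stated in its standard form; the symbols $G$, $T$, $\lambda$, and $\beta_k$ below are our illustrative notation rather than the paper's. Given a co-coercive operator $G$, one seeks a root $x^{\star}$ with $Gx^{\star} = 0$ and iterates, from an anchor point $x^0$,
\begin{equation*}
  x^{k+1} \;=\; \beta_k\, x^0 + (1 - \beta_k)\, T x^k, \qquad T := \mathrm{Id} - \lambda G, \quad \beta_k \in (0,1),
\end{equation*}
where $\lambda > 0$ is a step size (chosen so that $T$ is nonexpansive) and a common anchoring choice is $\beta_k = \frac{1}{k+2}$. The equivalence asserted above is obtained by rewriting such an anchored iteration, through a simple change of variables, as a Nesterov-type scheme with a momentum correction term.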