Humanoid robots are expected to work in human environments owing to their human-like shape, and they are required to achieve localization and navigation autonomously. In this paper, indoor navigation is realized with a view-based approach using a camera mounted on the robot's head. For a humanoid robot, image blur and swing induced by walking are a crucial issue for image matching during localization. The quantitative effect of walking on the captured images is first investigated using a motion capture system. Then a method to generate a stable view sequence is proposed based on the detection of optical flows. The navigation function with the proposed method was implemented on the humanoid robot HRP-2, and its effectiveness was confirmed through indoor walking experiments.
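The abstract does not specify how the optical-flow detection is used to build the stable view sequence, so the following is only an illustrative sketch of the general idea: estimate the inter-frame flow with a simple global Lucas-Kanade least-squares fit, then keep frames whose flow magnitude (a proxy for head swing) stays below a hypothetical threshold. Function names, the single-translation flow model, and the threshold are all assumptions, not the paper's actual method.

```python
import numpy as np

def lucas_kanade_flow(img1, img2):
    """Estimate a single global translation (u, v) between two grayscale
    frames via the Lucas-Kanade least-squares formulation.
    Illustrative only: real systems use windowed, pyramidal flow."""
    # Spatial gradients of the first frame and the temporal difference.
    Ix = np.gradient(img1, axis=1)
    Iy = np.gradient(img1, axis=0)
    It = img2 - img1
    # Brightness constancy: Ix*u + Iy*v + It = 0, solved in least squares.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

def select_stable_frames(frames, max_flow=1.0):
    """Keep frames whose flow magnitude relative to the previous frame
    is below `max_flow` (hypothetical stability threshold)."""
    kept = [0]
    for i in range(1, len(frames)):
        u, v = lucas_kanade_flow(frames[i - 1], frames[i])
        if np.hypot(u, v) <= max_flow:
            kept.append(i)
    return kept
```

With small camera motion between frames, the least-squares estimate recovers the shift well; during the swing phase of walking the estimated magnitude grows and those frames would be dropped from the view sequence.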