This paper presents a framework for navigating obstacle-dense environments as posed in the 2016 International Conference on Intelligent Robots and Systems (IROS) Autonomous Drone Racing Challenge. Our framework is based on direct visual servoing and leg-by-leg planning to navigate a complex environment filled with many similar frame-shaped obstacles that the vehicle must fly through. Our indoor navigation method relies on velocity measurements from an optical-flow sensor, since position measurements from GPS or external cameras are not available. For precise navigation through a sequence of obstacles, a center-point matching method is used together with depth information from the onboard stereo camera. The guidance points are generated directly in three-dimensional space from the two-dimensional image data to avoid accumulating errors from sensor drift. The proposed framework is implemented on a quadrotor-based aerial vehicle, which carries an onboard vision-processing computer for self-contained operation. Using the proposed method, our drone finished in first place in the world-premiere IROS Autonomous Drone Racing Challenge.
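As a minimal sketch of the guidance-point generation summarized above (assuming a pinhole camera model; the intrinsic values and function name below are hypothetical and not taken from the paper), a detected gate-center pixel can be back-projected into a three-dimensional guidance point in the camera frame using the stereo depth:

```python
import numpy as np

# Hypothetical pinhole intrinsics for the onboard stereo camera (illustrative values only).
FX, FY = 425.0, 425.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point in pixels

def gate_center_to_guidance_point(u, v, depth_m):
    """Back-project a detected gate-center pixel (u, v) with stereo depth (meters)
    into a 3D guidance point expressed in the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

# Example: gate center detected at pixel (352, 221), measured 3.4 m away.
p_cam = gate_center_to_guidance_point(352, 221, 3.4)
print(p_cam)  # 3D target toward which the vehicle servos for this leg
```

Because the guidance point is computed per leg directly from the current image and depth measurement, it does not inherit drift accumulated in the velocity-based state estimate.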