Abstract-In this paper we present a step-planner embedded in a framework that enables a humanoid robot to navigate among obstacles, exploiting its overall capabilities. The system allows the robot to react to changes in user input in real time while walking at reasonable speeds. The proposed method relies neither on external sensors nor on color coding or textured surfaces. The key idea is to use a fast collision model based on swept-sphere-volumes (SSVs) for the real-time generation of collision-free footsteps and whole-body trajectories. Using an SSV-based 3D approximation in all control modules enables the robot to avoid collisions with itself and the environment. Obstacles are detected with an on-board RGB-D sensor while the robot navigates through an environment that is not known in advance. The step-planner reacts to high-level user commands, such as desired velocity and direction, within less than one step. Instead of considering only the footholds, an articulated 3D approximation of the lower leg and foot is used to find feasible and optimal footstep locations. Additionally, the planner provides an initial solution for the swing-foot movement. Finally, collision-free swing-foot trajectories are generated in real time in the feedback control layer using all of the foot's degrees of freedom. We validated this approach in experiments with our robot Lola.