This paper discusses progress toward developing a multi-rotor helicopter with a vision-based ability to navigate through an a priori unknown, GPS-denied environment. We highlight the backbone of our system: relative estimation and control. We depart from the common practice of using a globally referenced map, preferring instead to keep the position and yaw states in the EKF relative to the current map node. This relative navigation approach allows simple application of sensor updates, natural characterization of the transformations between map nodes, and the potential to generate a globally consistent map when desired. The EKF fuses view-matching data from a Microsoft Kinect with higher-rate IMU data to provide state estimates at rates sufficient to control the vehicle's fast dynamics. Although an EKF is used for estimation, the map is represented as a graph of nodes and edges. Hardware results demonstrating the quality of the estimates, as well as flights with the estimates in the control loop, are provided.
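To make the relative-state bookkeeping concrete, the minimal Python sketch below illustrates one way such a scheme could be organized: the filter's position and yaw are kept relative to the current map node, declaring a new node stores the accumulated relative transform as a graph edge, and a globally referenced pose can be recovered by composing edges when desired. The class and method names (e.g., `RelativeEstimator`, `new_node`) are illustrative assumptions, not the authors' implementation.

```python
import math
from dataclasses import dataclass, field


@dataclass
class Edge:
    """Relative transform (2D position + yaw) from one map node to the next."""
    src: int
    dst: int
    dx: float
    dy: float
    dyaw: float


@dataclass
class RelativeMap:
    """Graph of nodes and edges; only the edges carry geometric information."""
    edges: list = field(default_factory=list)
    current_node: int = 0
    node_count: int = 1


class RelativeEstimator:
    """Keeps position and yaw relative to the current map node (sketch only)."""

    def __init__(self):
        self.map = RelativeMap()
        # Relative state: x, y, yaw expressed in the current node's frame.
        self.x = self.y = self.yaw = 0.0

    def new_node(self):
        """Declare a new node: store the relative transform as an edge and
        reset the relative portion of the state to zero."""
        m = self.map
        m.edges.append(Edge(m.current_node, m.node_count,
                            self.x, self.y, self.yaw))
        m.current_node = m.node_count
        m.node_count += 1
        self.x = self.y = self.yaw = 0.0

    def global_pose(self):
        """Compose the stored edge transforms back to the first node to
        recover a globally referenced pose when desired."""
        gx = gy = gyaw = 0.0
        for e in self.map.edges:  # edges are stored in order of creation
            c, s = math.cos(gyaw), math.sin(gyaw)
            gx += c * e.dx - s * e.dy
            gy += s * e.dx + c * e.dy
            gyaw += e.dyaw
        # Finally apply the current relative state.
        c, s = math.cos(gyaw), math.sin(gyaw)
        return (gx + c * self.x - s * self.y,
                gy + s * self.x + c * self.y,
                gyaw + self.yaw)
```

In this sketch, sensor updates only ever touch the small relative state, which is what makes their application simple, while global consistency is deferred to an optional composition (or optimization) over the edge graph.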