With tiny and limited nervous systems, insects demonstrate a remarkable ability to fly through complex environments. Optic flow has been shown to play a crucial role in regulating flight conditions and navigation in flies and bees. In robotics, optic flow has been widely studied owing to its low computational requirements. However, being derived from monocular visual information alone, optic flow is inherently devoid of the scale factor required to estimate absolute distance. In this paper, we propose a strategy for estimating the flight altitude of a flying robot with a ventral camera by combining optic flow with measurements from an inertial measurement unit (IMU). Instead of the prevalent feature-based approach to computing optic flow, we implement a direct method that evaluates the flow from image gradients. We show that the direct approach notably simplifies the computation compared to the feature-based method. When combined with an extended Kalman filter that fuses the IMU measurements, the flight altitude can be estimated in real time. We carried out extensive flight tests in different settings. Over 31 hovering and vertical flights near an altitude of 40 cm, the RMS error of the altitude estimate was 2.51 cm. Further analysis of the factors that affect the quality of the flow and the distance estimate is also provided.
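To make the two ideas in the abstract concrete, the sketch below illustrates (i) a direct, gradient-based estimate of a single translational flow vector from two frames of a ventral camera, and (ii) how a metric lateral speed (which in the paper would come from the IMU/EKF fusion) turns that scale-free flow into an absolute altitude. This is a minimal illustration under our own assumptions, not the authors' implementation; the focal length `f_px`, the frame interval `dt`, and the whole-image least-squares formulation are illustrative choices.

```python
import numpy as np

def global_flow_from_gradients(prev_img, curr_img, dt):
    """Estimate one translational flow vector (px/s) for the whole frame.

    Direct method: uses the brightness-constancy constraint
    Ix*u + Iy*v + It = 0 and solves it in a least-squares sense over all
    pixels, with no feature detection or matching.
    """
    prev = prev_img.astype(np.float64)
    curr = curr_img.astype(np.float64)

    # Spatial gradients of the average frame (np.gradient returns the
    # row-direction gradient first) and the temporal gradient.
    Iy, Ix = np.gradient(0.5 * (prev + curr))
    It = (curr - prev) / dt

    # Normal equations of the global least-squares problem.
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    u, v = np.linalg.solve(A, b)  # flow in px/s (assumes A is well conditioned)
    return u, v

def altitude_from_flow(flow_px_s, lateral_speed_m_s, f_px):
    """Recover absolute altitude from the flow magnitude and a metric
    lateral speed (e.g. from inertial measurements): h ~= f * |v| / |flow|."""
    flow_mag = np.hypot(*flow_px_s)
    return f_px * lateral_speed_m_s / max(flow_mag, 1e-6)

# Illustrative numbers: 100 px/s of flow, 0.5 m/s lateral speed,
# focal length 250 px  ->  altitude of about 1.25 m.
print(altitude_from_flow((100.0, 0.0), 0.5, 250.0))
```

In the pipeline described by the abstract, the flow measurement and the inertial measurements would instead be fused in an extended Kalman filter, so the lateral speed and the altitude are estimated jointly and recursively rather than via the single closed-form ratio shown here.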