Vision-based navigation allows airborne systems to control their motion relative to the ground in the absence of other supporting sensors, including global navigation satellite systems. Rather than relying on computationally intensive localization techniques such as online map construction, identification and tracking of landmarks, or otherwise producing an explicit quantitative estimate of position, we propose and experimentally demonstrate a closed-loop visual navigation reflex, which we term the optical ground course controller. The behavior is applicable to fixed-wing aircraft traversing long ranges and reduces the online computation and sensing required compared with other visual methods. The method combines the kinematics of fixed-wing flight, the direction of apparent motion in an image sequence, and a magnetic compass to create a bioinspired optomotor reflex similar to those observed in insects. This behavior accurately controls track in the inertial reference frame (the path taken over the ground) with only limited dependence on altitude, speed, and wind. We show that the proposed behavior is naturally convergent and stable, and we present experimental results from simulation and real-world flight demonstrating that the method performs robustly, improving on both magnetic-referenced and visual-odometry-based navigation within the limits of the sensor.

KEYWORDS
embodied autonomy, optical flow, optomotor, UAV
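To make the closed-loop idea described above concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes a downward-looking camera, derives the direction of apparent ground motion from the mean optical flow, rotates that direction into the inertial frame using the compass heading, and feeds the wrapped course error to a proportional steering command. All names, the frame conventions, and the gain `k_p` are hypothetical.

```python
import numpy as np

def optical_ground_course_controller(flow_dx, flow_dy, psi_mag, chi_des, k_p=0.5):
    """Hypothetical sketch of a ground-course optomotor reflex.

    flow_dx, flow_dy : mean image-plane optical flow components from a
                       downward-looking camera aligned with the body frame
    psi_mag          : magnetic heading from the compass [rad]
    chi_des          : desired ground course (track over the ground) [rad]
    k_p              : proportional gain (illustrative value)

    Returns a steering command (e.g., a heading-rate or bank command)
    that drives the apparent ground motion toward the desired course.
    """
    # Direction of apparent motion over the ground, in the body frame.
    chi_body = np.arctan2(flow_dy, flow_dx)
    # Rotate into the inertial frame using the compass heading.
    chi_meas = psi_mag + chi_body
    # Wrapped course error in (-pi, pi] drives the steering command;
    # no position estimate, map, or landmark tracking is involved.
    err = np.arctan2(np.sin(chi_des - chi_meas), np.cos(chi_des - chi_meas))
    return k_p * err
```

Because the controller acts only on the direction of the flow field, not its magnitude, such a scheme would be consistent with the abstract's claim of limited dependence on altitude and speed.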