Differences in acquisition time, illumination conditions, and viewing angle lead to significant variation among airborne remote sensing images acquired by Unmanned Aerial Vehicles (UAVs). As a result, real-time scene matching navigation based on fixed reference maps is error-prone and lacks robustness. This paper presents a novel Shadow-based Matching (SbM) method for the localization of low-altitude UAVs. A reference shadow map is generated from an accurate (0.5 m spatial resolution) Digital Surface Model (DSM) using the known acquisition date and time; a robust shadow detection algorithm extracts shadows from the aerial images; the shadows then serve as a stable feature for scene matching navigation. By combining shadow-based matching with the conventional intensity-based matching method, a fusion scene matching navigation scheme that is more robust to illumination variations is proposed. Experiments were performed with Google satellite maps, DSM data, and real aerial images of the Zurich region. The radial localization error of SbM is less than 7.3 m at flight heights below 1200 m, and the fusion navigation approach achieves an optimal combination of shadow-based and intensity-based matching. This study thus offers a solution to the inconsistencies caused by changes in illumination, viewing angle, and acquisition time, enabling accurate and effective scene matching navigation.
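
For concreteness, the sketch below shows one way a reference shadow map could be rasterized from a DSM once the sun's azimuth and elevation are known for the acquisition date and time. It is a minimal numpy ray-marching example under stated assumptions, not the paper's actual procedure: the function name, grid conventions, and default parameters are illustrative, and the solar-position computation (date/time to sun angles) is assumed to be done separately.

```python
import numpy as np

def cast_shadow_map(dsm, cell_size, sun_azimuth_deg, sun_elevation_deg, max_dist=200.0):
    """Binary shadow mask for a DSM grid of heights in metres (hypothetical helper).

    A cell is shadowed if, marching from it toward the sun, any DSM sample rises
    above the line of sight to the sun. Azimuth is clockwise from north, elevation
    above the horizon; both would come from the known acquisition date and time
    via a solar-position model (not shown here).
    """
    az = np.deg2rad(sun_azimuth_deg)
    el = np.deg2rad(sun_elevation_deg)
    # Step direction toward the sun in grid coordinates (row 0 = north edge, col = east).
    d_col, d_row = np.sin(az), -np.cos(az)
    rise_per_m = np.tan(el)                      # sight-line height gained per metre toward the sun
    n_steps = int(max_dist / cell_size)

    rows, cols = dsm.shape
    shadow = np.zeros_like(dsm, dtype=bool)
    rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")

    for k in range(1, n_steps + 1):
        dist = k * cell_size
        # Sample the DSM at the k-th step toward the sun (clamped at the grid border).
        r = np.clip(np.round(rr + d_row * k).astype(int), 0, rows - 1)
        c = np.clip(np.round(cc + d_col * k).astype(int), 0, cols - 1)
        # Shadowed if the terrain sample toward the sun exceeds the sight line to the sun.
        shadow |= dsm[r, c] > dsm + dist * rise_per_m
    return shadow

# Example: a single 20 m "building" on flat ground, sun low in the southwest,
# producing a shadow mask cast toward the northeast.
dsm = np.zeros((200, 200))
dsm[90:110, 90:110] = 20.0
mask = cast_shadow_map(dsm, cell_size=0.5, sun_azimuth_deg=225.0, sun_elevation_deg=25.0)
```

A binary mask of this form could then be compared against shadows detected in the aerial image by standard template or binary-mask matching; the abstract does not specify which matching criterion the SbM method uses.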