This article presents a real-time localization method for Unmanned Aerial Vehicles (UAVs) based on continuous image processing. The proposed method employs the Scale Invariant Feature Transform (SIFT) algorithm to identify key points in multi-scale space and to generate descriptor vectors for matching identical objects across multiple images. The matched points provide pixel positions which, combined with coordinate transformation equations, allow the UAV's actual ground position to be calculated. In addition, the physical coordinates of the matched points in the image can be obtained, from which the UAV's physical coordinates follow. The method achieves real-time positioning and tracking during UAV flight, and experimental results demonstrate that, within an acceptable error range, the UAV coordinates computed by the proposed algorithm are consistent with the actual coordinates. The maximum error along the x-, y-, and z-axes is 4.501 cm, with the horizontal error exhibiting high stationarity and the vertical error having a low average value of 0.041 cm. The proposed real-time positioning algorithm is simple, easy to implement, and has low error, making it suitable for UAVs with limited onboard computational power.
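As a minimal sketch (not the authors' implementation) of the SIFT detection-and-matching step summarized above, the following Python/OpenCV snippet detects multi-scale keypoints, computes descriptor vectors, and matches them between two consecutive frames; the image paths and the ratio-test threshold are illustrative assumptions, and the resulting pixel coordinates are what the transformation equations would subsequently consume.

```python
import cv2

# Load two consecutive frames captured during flight (paths are hypothetical).
img1 = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Detect keypoints in multi-scale space and compute 128-D SIFT descriptors.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors between the frames; Lowe's ratio test (0.75 is an
# assumed threshold) keeps only distinctive correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
candidates = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in candidates if m.distance < 0.75 * n.distance]

# Pixel positions of corresponding points; these would be passed to the
# coordinate transformation equations to recover the UAV's position.
pts1 = [kp1[m.queryIdx].pt for m in good]
pts2 = [kp2[m.trainIdx].pt for m in good]
print(f"{len(good)} matched point pairs")
```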