This paper presents the implementation of a mutual-aided navigation system for an aerial vehicle. Employing all available sensors in navigation is effective for maintaining continuous and optimal results. Images offer rich information about the surrounding environment, but image processing is time-consuming and introduces timing problems. Whereas traditional fusion algorithms tend to mitigate the delay errors or ignore them, this work relies on recalculating the state estimate over the delay interval and on sequential filtering. To reduce the image-matching time, the map is processed offline, and keypoint clusters are stored so that features do not have to be recomputed online. The sensors' information is used to bound the search space for matched features on the map, and these features are then reprojected onto the captured images to exclude irrelevant regions from processing. The proposed mutual-aiding scheme compensates for inertial system drift, which enhances the system's accuracy and independence. The system was tested on data collected from a real flight with a DJI drone. Measurements from an inertial measurement unit (IMU), a camera, a barometer, and a magnetometer were fused using a sequential Kalman filter. The final results demonstrate the efficiency of the proposed system, which navigates with high independence and an RMS position error of less than 3.5 m.
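To make the sequential-filtering idea concrete, the following is a minimal sketch of a generic sequential Kalman measurement update, in which each scalar measurement (e.g., from the barometer, magnetometer, or a matched image feature) is applied one at a time rather than in a single batch update. This is an illustrative sketch under the assumption of uncorrelated measurement noise, not the paper's exact implementation; the function name and array shapes are chosen for the example.

```python
import numpy as np

def sequential_update(x, P, z, H, R):
    """Sequential Kalman measurement update (illustrative sketch).

    x : state estimate, shape (n,)
    P : state covariance, shape (n, n)
    z : measurements, shape (m,)
    H : measurement matrix, shape (m, n)
    R : measurement noise variances, shape (m,), assumed uncorrelated

    Each scalar measurement is fused in turn, so only scalar divisions
    are needed instead of an m x m matrix inversion.
    """
    for i in range(len(z)):
        h = H[i]                       # row for the i-th scalar measurement
        s = h @ P @ h + R[i]           # innovation variance (scalar)
        k = P @ h / s                  # Kalman gain, shape (n,)
        x = x + k * (z[i] - h @ x)     # correct the state with the residual
        P = P - np.outer(k, h @ P)     # covariance update, P = (I - k h) P
    return x, P
```

Processing measurements sequentially in this way also makes it straightforward to fuse sensors that arrive at different times, which is the setting addressed by the delay-compensation scheme described above.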