Intelligent unmanned systems have important applications, such as pesticide spraying in agriculture, robot-based warehouse management, and missile-firing drones. The underlying assumption behind all such autonomy is that the agent knows its position or egomotion relative to some reference frame or scene. A vast number of localization systems have been proposed in the literature, combining various sensors and algorithms, such as visual and visual-inertial SLAM, to achieve robust localization. The majority of these methods rely on one or more sensors, including LiDAR, cameras, IMUs, UWB, GPS, compasses, and motion-tracking systems. This survey presents a systematic, chronological review and analysis of published algorithms and techniques, highlighting highly influential works. For each type of system, we provide an in-depth investigation and taxonomy of its sensory data formation principle, feature association principle, egomotion estimation formulation, and fusion model. Finally, we discuss open problems and directions for future research. We aim to survey the literature comprehensively, providing a complete understanding of localization methodologies, their performance, advantages, limitations, and evaluations, and thereby shedding light on future research.