Sensors commonly mounted on small unmanned ground vehicles (UGVs) include visible-light and thermal cameras, scanning LIDAR, and ranging sonar. Data from these sensors is vital to emerging autonomous robotic behaviors. However, data from any given sensor can become noisy or erroneous under a range of conditions, reducing the reliability of autonomous operations. We seek to increase this reliability through data fusion, which involves characterizing the strengths and weaknesses of each sensor modality and combining their data so that the fused result is more accurate than that of any single sensor. We describe data fusion efforts applied to two autonomous behaviors: leader-follower and human presence detection. The behaviors are implemented and tested in a variety of realistic conditions.

KEYWORDS: robotics, unmanned systems, data fusion, intelligent behaviors, computer vision

1. BACKGROUND
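As a simple illustration of the fusion principle described above, two independent range measurements (say, from LIDAR and sonar) can be combined by inverse-variance weighting, yielding an estimate whose variance is lower than that of either sensor alone. The sketch below uses hypothetical readings and noise variances chosen for illustration only:

```python
def fuse(z1, var1, z2, var2):
    """Fuse two independent measurements by inverse-variance weighting.

    Returns the fused estimate and its variance; the fused variance is
    always smaller than either input variance.
    """
    w1 = 1.0 / var1  # weight each reading by its reliability
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical readings: LIDAR reports 4.9 m (variance 0.01 m^2),
# sonar reports 5.2 m (variance 0.04 m^2).
estimate, variance = fuse(4.9, 0.01, 5.2, 0.04)
print(estimate, variance)  # -> 4.96 0.008
```

The fused estimate (4.96 m) lies closer to the more reliable LIDAR reading, and its variance (0.008 m^2) is below both inputs, which is the sense in which fusion "provides more accurate data than any single sensor."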
Technology Transfer Project

The JGRE Technology Transfer Project (TechTXFR), managed by Space and Naval Warfare Systems Center, San Diego (SSC San Diego), seeks to enhance the functionality (ability to perform more tasks) and autonomy (ability to perform with less human intervention) of teleoperated systems [1]. The objective is to expedite advancement of the technologies needed to produce an autonomous robot that can robustly perform in battlefield situations. Instead of developing new capabilities from scratch, the approach is to assess the technology readiness levels (TRLs) of component technologies (e.g., mapping, object recognition, motion-detection-on-the-move) developed under a variety of past and ongoing R&D efforts (such as the DARPA Tactical Mobile Robot program). The most mature algorithms are integrated and optimized into cohesive behavior architectures and then ported to various platforms used by the warfighter for further evaluation in operational environments.
Contributing sources of component technologies include the Idaho National Laboratory (INL), NASA's Jet Propulsion Laboratory, Carnegie-Mellon University (CMU), Stanford Research Institute International (SRI), University of Michigan, Brigham Young University, University of California San Diego, and University of Texas Austin, as well as other SSC San Diego projects (e.g., Man Portable Robotic System [2] and the ROBART series [3]). Starting in FY-03, the approach was to harvest existing indoor navigation technologies developed by various players and assess their different approaches to dead reckoning, obstacle detection/avoidance, mapping, localization, and path planning. The details of these focus areas will not be discussed in this paper but can be found in previous project publications [4]. The best features of the more promising solutions have now been integrated into a single optimized system, giving an operator the ability to send an autonomous platform into an unknown indoor area and accurately map the surroundings. An augmented virtuality representation of the environment is derived, fusing real-time sensor information with the evo...