Safety and robust operation of autonomous vehicles are pertinent concerns for both defense and civilian systems. From self-driving cars to autonomous Navy vessels and Army vehicles, malfunctions can have devastating consequences, including loss of life and damage to infrastructure. Autonomous ground vehicles use a variety of sensors to image their environment: passive sensors such as RGB and thermal cameras, and active sensors such as RGBD cameras, LIDAR, radar, and sonar. These sensors are used alone or fused to accomplish the basic mobile autonomy tasks: obstacle avoidance, localization, mapping, and, subsequently, path planning. In this paper, we provide a qualitative and quantitative analysis of the limitations of ROS mapping algorithms when depth sensors (LIDAR and RGBD) are degraded or obscured, e.g., by dust, heavy rain, snow, or other noise. We investigate both the form of the degradation and its effect on autonomous operation. This work summarizes the limitations of publicly available ROS mapping algorithms under degraded or obscured depth sensing, and it lays a foundation for developing robust autonomy algorithms that are resilient to such conditions.
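As a minimal sketch of how such degradation might be injected for evaluation, the following ROS node republishes a LaserScan with random beam dropout (emulating beams obscured by dust, rain, or snow) and additive Gaussian range noise. The topic names (/scan, /scan_degraded), node name, and noise parameters are illustrative assumptions, not values taken from this paper.

```python
#!/usr/bin/env python
# Hypothetical degradation node: subscribes to a LaserScan topic and
# republishes a corrupted copy. Topic names and parameters are assumptions.
import random

import rospy
from sensor_msgs.msg import LaserScan

DROPOUT_PROB = 0.2   # assumed fraction of beams obscured (e.g., dust, rain)
NOISE_STDDEV = 0.05  # assumed additive range noise, in meters


def degrade(scan, pub):
    out = LaserScan()
    out.header = scan.header
    out.angle_min = scan.angle_min
    out.angle_max = scan.angle_max
    out.angle_increment = scan.angle_increment
    out.time_increment = scan.time_increment
    out.scan_time = scan.scan_time
    out.range_min = scan.range_min
    out.range_max = scan.range_max
    # Drop a random subset of beams (no return) and perturb the rest.
    out.ranges = [
        float('inf') if random.random() < DROPOUT_PROB
        else r + random.gauss(0.0, NOISE_STDDEV)
        for r in scan.ranges
    ]
    pub.publish(out)


if __name__ == '__main__':
    rospy.init_node('scan_degrader')
    pub = rospy.Publisher('/scan_degraded', LaserScan, queue_size=10)
    rospy.Subscriber('/scan', LaserScan, degrade, callback_args=pub)
    rospy.spin()
```

A mapping stack under test could then be pointed at /scan_degraded instead of /scan, allowing map quality to be compared across dropout and noise levels.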