Classic 2D SLAM is no longer adequate for today's complex environments. This report uses an Ubuntu virtual machine running a Slam_bot package, based on the RTAB-Map algorithm and visual SLAM mapping, to simulate a four-wheeled robot autonomously navigating to target points in various environments. The report also introduces an RGB-D SLAM algorithm that fuses visual and depth data to process the measurements collected from the robot's sensors, which include a lidar, an RGB camera, and wheel odometry. The aim is to evaluate whether the RTAB-Map algorithm with an RGB-D sensor can replace classic 2D SLAM. The results show that the robot using RGB-D data and RTAB-Map performs well: the navigation system completes localization and navigation in many complex situations. However, some problems remain. The robot's speed is low, which may limit the application of self-navigating robots to a certain extent, particularly in time-critical scenarios such as emergency response.