In the robotics field, one of the main challenges is autonomous navigation using a single camera, where camera images are processed on a frame-to-frame basis. However, obtaining clear, noise-free images remains a challenge under the erratic motion typical of moving robots. To address this problem, several works use optical flow techniques to eliminate blurred reference points in RGB images when the robot moves. The NAO robot is an example of a robotic platform that generates an oscillatory movement when walking, producing blurred images that may compromise the image processing task. In this work, we focus on the problem of depth estimation from a single image for the NAO robot, which is useful for autonomous navigation. For depth estimation, we argue that the erratic movement exhibited by the robot's walking motion can be exploited to obtain optical flow vectors, which are strongly related to the depth observed by the NAO's camera. Thus, we present a real-time system based on a Convolutional Neural Network (CNN) architecture that uses optical flow as input channels to estimate depth. To this end, we present a new dataset that pairs optical flow images with depth images for training. Our results indicate that optical flow can be exploited in humanoid robots such as the NAO, and we are confident that it could also be used on other platforms with erratic motion.
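To make the idea concrete, the sketch below shows one way a CNN could map a two-channel optical flow field (horizontal and vertical components per pixel) to a dense depth map. This is not the authors' network: the encoder-decoder layout, layer sizes, framework (PyTorch), input resolution, and L1 loss are all illustrative assumptions, offered only as a minimal example of using optical flow as CNN input channels for depth estimation.

```python
# Minimal sketch (assumed architecture, not the paper's): a small
# encoder-decoder CNN in PyTorch that maps a 2-channel optical flow field
# (u, v) to a dense single-channel depth map.
import torch
import torch.nn as nn

class FlowToDepthCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: downsample the flow field while increasing channel count.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to the input resolution, ending in 1 depth channel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, flow):
        # flow: (N, 2, H, W) optical flow vectors -> (N, 1, H, W) predicted depth
        return self.decoder(self.encoder(flow))

if __name__ == "__main__":
    model = FlowToDepthCNN()
    flow = torch.randn(1, 2, 240, 320)         # dummy per-pixel (u, v) flow field
    target_depth = torch.rand(1, 1, 240, 320)  # dummy ground-truth depth image
    pred = model(flow)
    loss = nn.functional.l1_loss(pred, target_depth)  # assumed training objective
    print(pred.shape, loss.item())
```

In this kind of setup, training pairs would come from a dataset like the one described above (optical flow images associated with depth images), with the flow computed from consecutive frames captured while the robot walks.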