The main focus of this work is the development of a software architecture to autonomously navigate a flying vehicle in an indoor environment in the presence of obstacles. The hardware platform used to test the developed algorithms is the AscTec Firefly equipped with an RGB-D camera (Microsoft Kinect): the sensor output is used to incrementally build a map of the environment and generate a collision-free path. Specifically, we introduce a novel approach to compute the path analytically in an efficient and effective manner. An initial path, given by the intersection of two 3D surfaces, is shaped around obstacles by adding a radial function to either of the two surfaces at every obstacle location. The intersection of the deformed surfaces is guaranteed not to intersect obstacles, and hence provides a safe path for the robot to follow. The entire computation runs on-board and the path is computed in real time. In this article we present the developed algorithms and the software architecture, as well as the results of our experiments, showing that the method can adapt the robot's path in real time to avoid several types of obstacles while producing a map of the surroundings.
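The surface-intersection idea described above can be illustrated with a short numerical sketch. The code below is not the authors' implementation: it assumes two axis-aligned planes as the undeformed surfaces, Gaussian radial bumps as the deformation, and obstacle positions, amplitudes, and widths chosen purely for demonstration.

```python
"""Illustrative sketch (not the authors' code) of the surface-intersection
path idea: a nominal straight path is the intersection of two planes, and a
Gaussian radial bump added to one plane at each obstacle pushes the
resulting intersection curve away from that obstacle.  All numerical
parameters below are assumptions made for the example."""
import numpy as np

def bump(p, obstacles, amp=1.0, sigma=0.5):
    """Sum of radial (Gaussian) deformations centred on the obstacles."""
    d2 = ((p[None, :] - obstacles) ** 2).sum(axis=1)
    return (amp * np.exp(-d2 / (2.0 * sigma ** 2))).sum()

def deformed_path(start, goal, obstacles, n=100, cruise_z=1.0):
    """Trace the intersection of the deformed lateral surface
    y = bump(x, y, z) with the constant-altitude plane z = cruise_z."""
    xs = np.linspace(start[0], goal[0], n)
    bound = float(len(obstacles)) + 1.0   # bracket for the bisection below
    path = []
    for x in xs:
        lo_y, hi_y = -bound, bound
        # g(y) = y - bump(...) is negative at lo_y and positive at hi_y,
        # so plain bisection finds the lateral offset y(x) of the path.
        for _ in range(60):
            mid = 0.5 * (lo_y + hi_y)
            g = mid - bump(np.array([x, mid, cruise_z]), obstacles)
            if g > 0.0:
                hi_y = mid
            else:
                lo_y = mid
        path.append((x, 0.5 * (lo_y + hi_y), cruise_z))
    return np.array(path)

if __name__ == "__main__":
    start, goal = np.array([0.0, 0.0, 1.0]), np.array([5.0, 0.0, 1.0])
    obstacles = np.array([[2.5, 0.0, 1.0]])  # one obstacle on the nominal line
    p = deformed_path(start, goal, obstacles)
    print(p[::10])                           # path bends around the obstacle
```

In this toy setup the undeformed path is the straight segment joining start and goal; adding the bump to the lateral surface bends the intersection curve sideways near the obstacle, while the constant-altitude plane is left untouched.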
Attention leads the gaze of the observer towards interesting items, allowing a detailed analysis only of selected regions of a scene. A robot can take advantage of the perceptual organisation of the features in the scene to guide its attention and better understand its environment. Current bottom-up attention models work with standard RGB cameras, requiring a significant amount of time to detect the most salient item in a frame-based fashion. Event-driven cameras are an innovative technology that asynchronously detects contrast changes in the scene with high temporal resolution and low latency. We propose a new neuromorphic pipeline that exploits the asynchronous output of event-driven cameras to generate saliency maps of the scene. To further decrease the latency, the neuromorphic attention model is implemented as a spiking neural network on SpiNNaker, a dedicated neuromorphic platform. The proposed implementation has been compared with its bio-inspired GPU counterpart and benchmarked against ground-truth fixation maps. The system successfully detects items in the scene, producing saliency maps comparable with the GPU implementation. The asynchronous pipeline achieves an average latency of 16 ms to produce a usable saliency map.
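To make the event-to-saliency step more concrete, the sketch below shows one simple software approximation of a bottom-up saliency map computed from an event stream: events are accumulated into a decaying surface and a centre-surround (difference-of-Gaussians) operator highlights locally contrasting activity. This is not the paper's SpiNNaker spiking network; the event format, decay constant, sensor resolution, and filter scales are assumptions for the example.

```python
"""Illustrative sketch (not the paper's spiking implementation): turning a
stream of events into a bottom-up saliency map by decaying an event-count
surface and applying a centre-surround (difference-of-Gaussians) operator."""
import numpy as np
from scipy.ndimage import gaussian_filter

HEIGHT, WIDTH = 240, 304   # assumed sensor resolution
TAU = 0.05                 # assumed decay time constant [s]

def update_surface(surface, events, t_now, t_prev):
    """Exponentially decay the surface, then add the new events.
    Each event is a tuple (timestamp, x, y, polarity)."""
    surface *= np.exp(-(t_now - t_prev) / TAU)
    for _, x, y, _ in events:
        surface[y, x] += 1.0
    return surface

def saliency_map(surface, sigma_center=2.0, sigma_surround=8.0):
    """Centre-surround contrast: fine-scale minus coarse-scale activity,
    rectified and normalised to [0, 1]."""
    center = gaussian_filter(surface, sigma_center)
    surround = gaussian_filter(surface, sigma_surround)
    sal = np.clip(center - surround, 0.0, None)
    return sal / sal.max() if sal.max() > 0 else sal

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    surface = np.zeros((HEIGHT, WIDTH))
    # Synthetic burst of events clustered around pixel (x=150, y=120).
    xs = rng.integers(140, 160, 500)
    ys = rng.integers(110, 130, 500)
    events = [(0.01, int(x), int(y), 1) for x, y in zip(xs, ys)]
    surface = update_surface(surface, events, t_now=0.01, t_prev=0.0)
    sal = saliency_map(surface)
    print("most salient pixel (y, x):", np.unravel_index(sal.argmax(), sal.shape))
```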
Citation: Glover A, Vasco V, Iacono M and Bartolozzi C (2018)

Event-driven (ED) cameras are an emerging technology that samples the visual signal based on changes in the signal magnitude, rather than at a fixed rate over time. This change in paradigm results in a camera with lower latency, lower power consumption, reduced bandwidth, and higher dynamic range. Such cameras offer many potential advantages for on-line, autonomous robots; however, the sensor data do not directly integrate with current "image-based" frameworks and software libraries. The iCub robot uses Yet Another Robot Platform (YARP) as middleware to provide modular processing and connectivity to sensors and actuators. This paper introduces a library that incorporates an event-based framework into the YARP architecture, allowing event cameras to be used with the iCub (and other YARP-based) robots. We describe the philosophy and methods for structuring events to facilitate processing while maintaining low-latency and real-time operation. We also describe several processing modules made available open-source, and three example demonstrations that can be run on the neuromorphic iCub.
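The general idea of structuring events for low-latency processing can be sketched as follows. This is explicitly not the API of the YARP event-driven library described above: the names, the address-event fields, and the packet period are assumptions used only to illustrate how a time-ordered event stream can be grouped into small packets with bounded latency instead of full frames.

```python
"""Illustrative sketch only -- NOT the API of the YARP event-driven library.
It shows an address-event representation (timestamp, x, y, polarity) and the
grouping of events into short packets so consumers see bounded latency."""
from collections import namedtuple
from typing import Iterable, Iterator, List

# A single address-event: when and where a contrast change was detected.
AddressEvent = namedtuple("AddressEvent", ["timestamp", "x", "y", "polarity"])

def packetise(events: Iterable[AddressEvent],
              period: float = 0.001) -> Iterator[List[AddressEvent]]:
    """Group a time-ordered event stream into packets spanning at most
    `period` seconds, so downstream modules receive data at a bounded,
    low latency instead of waiting for full frames."""
    packet: List[AddressEvent] = []
    window_start = None
    for ev in events:
        if window_start is None:
            window_start = ev.timestamp
        if ev.timestamp - window_start >= period and packet:
            yield packet
            packet, window_start = [], ev.timestamp
        packet.append(ev)
    if packet:
        yield packet

if __name__ == "__main__":
    # Synthetic stream: one event every 0.3 ms along a diagonal.
    stream = (AddressEvent(i * 3e-4, i % 304, i % 240, i % 2) for i in range(20))
    for pkt in packetise(stream, period=0.001):
        print(f"packet of {len(pkt)} events, "
              f"span {pkt[-1].timestamp - pkt[0].timestamp:.4f} s")
```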