Recently, emerging bio-inspired event cameras have demonstrated potential for a wide range of robotic applications in dynamic environments. In this paper, we propose FA-Harris, a novel fast and asynchronous event-based corner detection method. FA-Harris consists of several components: an event filter, a Global Surface of Active Events (G-SAE) maintaining unit, a corner candidate selection unit, and a corner candidate refinement unit. The proposed G-SAE maintenance and corner candidate selection algorithms greatly improve the real-time performance of corner detection, while the corner candidate refinement algorithm preserves detection accuracy by using an improved event-based Harris detector. Additionally, FA-Harris does not require artificially synthesized event frames and operates directly on asynchronous events. We implement the proposed method in C++ and evaluate it on the public Event Camera Datasets. The results show that our method achieves an approximately 8× speed-up compared with the previously reported event-based Harris detector, with no loss of accuracy.
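To make the pipeline structure described above concrete, the following is a minimal C++ sketch of the four stages named in the abstract (event filter, G-SAE maintenance, corner candidate selection, and Harris-based refinement). The class name, sensor resolution, and stub predicates are illustrative assumptions, not the authors' implementation.

```cpp
// Hypothetical sketch of an FA-Harris-style per-event pipeline.
// All names, sizes, and thresholds are placeholders for illustration only.
#include <array>
#include <cstdint>

struct Event {
    uint16_t x, y;   // pixel coordinates
    double   t;      // timestamp in seconds
    bool     polarity;
};

class FAHarrisSketch {
public:
    // Process one asynchronous event; returns true if it is reported as a corner.
    bool processEvent(const Event& e) {
        if (!passesFilter(e)) return false;        // 1) event filter (noise / redundant events)
        updateGSAE(e);                             // 2) maintain the Global Surface of Active Events
        if (!isCornerCandidate(e)) return false;   // 3) cheap candidate selection on the G-SAE
        return refineWithHarris(e);                // 4) refine candidate with an event-based Harris score
    }

private:
    // Global Surface of Active Events: latest timestamp per pixel (assumed 240x180 sensor).
    std::array<std::array<double, 240>, 180> gsae_{};

    bool passesFilter(const Event& e) const      { (void)e; /* e.g. reject events too close in time */ return true; }
    void updateGSAE(const Event& e)              { gsae_[e.y][e.x] = e.t; }
    bool isCornerCandidate(const Event& e) const { (void)e; /* coarse check on a local G-SAE patch */ return true; }
    bool refineWithHarris(const Event& e) const  { (void)e; /* event-based Harris response above a threshold */ return true; }
};
```

The candidate-selection stage is what makes the method fast in this reading: the comparatively expensive Harris refinement only runs on the small fraction of events that survive the earlier, cheaper checks.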
As a novel vision sensor, the dynamic and active-pixel vision sensor (DAVIS) combines a standard camera and an asynchronous event-based sensor in the same pixel array. In this paper, we propose a novel asynchronous feature tracking method based on line segments with the DAVIS. The proposed method takes asynchronous events, synchronous image frames, and IMU data as input. We first use the Harris detector to extract feature points and the Canny detector to extract line segment templates from the image frames. Then we select spatio-temporal windows from the asynchronous events and perform registration to estimate the optical flow. The registration is achieved by associating the extracted line segments with the events inside each window, using expectation maximization-iterative closest point (EM-ICP). Afterward, we use the estimated optical flow and the IMU data to update the positions of the line segments and take them as the new templates. We evaluate our method on the public event camera datasets. The results show that our method achieves performance comparable to other methods in terms of accuracy and tracking time.
Index Terms: Feature tracking, event camera, EM-ICP, line segments, DAVIS.
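The tracking loop described in this abstract can be summarized in a short C++ skeleton: register a spatio-temporal window of events against the current line-segment templates, then shift the segments by the estimated flow and reuse them as templates. The types, the helper registerEvents, and the omission of the IMU update are illustrative assumptions, not the paper's actual EM-ICP implementation.

```cpp
// Hypothetical skeleton of a line-segment tracking step with event registration.
// All type and function names are placeholders for illustration only.
#include <vector>

struct Event       { int x, y; double t; };
struct Point2d     { double x, y; };
struct LineSegment { Point2d p0, p1; };            // template extracted from an image frame
struct ImuSample   { double wx, wy, wz, t; };      // angular velocity and timestamp

struct LineTrackerSketch {
    std::vector<LineSegment> templates;            // Canny-derived line-segment templates

    // One tracking step over a spatio-temporal window of events.
    void step(const std::vector<Event>& window, const std::vector<ImuSample>& imu) {
        // 1) Associate events with the current templates and run an EM-ICP-style
        //    registration to estimate a 2-D optical flow (sketched as a helper).
        Point2d flow = registerEvents(window);

        // 2) Update the segment positions with the estimated flow and keep them as
        //    the new templates; IMU-predicted motion would be folded in here as well.
        for (auto& seg : templates) {
            seg.p0.x += flow.x; seg.p0.y += flow.y;
            seg.p1.x += flow.x; seg.p1.y += flow.y;
        }
        (void)imu;  // IMU compensation omitted in this sketch
    }

    Point2d registerEvents(const std::vector<Event>& window) const {
        (void)window;  // placeholder for EM-ICP registration between events and segments
        return {0.0, 0.0};
    }
};
```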