Abstract - This paper describes the application of a Convolutional Neural Network (CNN) in a predator/prey scenario. The CNN is trained and run on data from a Dynamic and Active Pixel Sensor (DAVIS) mounted on a Summit XL robot (the predator), which follows another Summit XL robot (the prey). The CNN is driven by both conventional image frames and dynamic vision sensor "frames", each of which accumulates a constant number of DAVIS ON and OFF events. The network is thus "data-driven" at a sample rate proportional to scene activity, so the effective sample rate varies from 15 Hz to 240 Hz depending on the robot speeds. The network produces four outputs: steer right, steer left, steer center, and prey non-visible. After off-line training on labeled data, the network is imported onto the predator's on-board computer, which runs jAER and issues steering commands in real time. Successful closed-loop trials, with accuracies up to 87% or 92% (depending on the evaluation criterion), are reported. Although the proposed approach discards the precise DAVIS event timing, it offers the significant advantage of compatibility with conventional deep learning technology without giving up the advantage of data-driven computing.
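The fixed-event-count frame construction described above can be sketched as follows. This is a minimal illustration, not the authors' jAER implementation: the function name, event-tuple layout, and sensor resolution are assumptions chosen for clarity. Each frame closes after a fixed number of events, so the frame rate rises and falls with scene activity.

```python
import numpy as np

def events_to_frame(events, n_events=2000, shape=(180, 240)):
    """Accumulate a fixed number of DVS events into a 2-channel histogram frame.

    `events` is an iterable of (x, y, polarity) tuples, with polarity True
    for ON and False for OFF events. Because each frame closes after exactly
    `n_events` events, the effective frame rate is data-driven: fast motion
    fills a frame sooner than slow motion.
    """
    frame = np.zeros((2, *shape), dtype=np.float32)  # channel 0: OFF, channel 1: ON
    for i, (x, y, pol) in enumerate(events):
        if i >= n_events:          # frame is full; remaining events belong to the next frame
            break
        frame[int(pol), y, x] += 1.0
    return frame
```

Such a frame can then be fed to a conventional CNN exactly like an ordinary image, which is the compatibility advantage the abstract highlights.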
Identifying the material from which an object is made is of significant value for effective robotic grasping and manipulation. Material characteristics can be retrieved through different sensory modalities: vision, touch, or sound. Compressibility, surface texture, and thermal properties can each be measured through physical contact with an object using tactile sensors. This paper presents a method for collecting data with a biomimetic fingertip (the BioTAC) in contact with various materials, and for using these data to classify the materials both individually and into groups by type. After data acquisition, principal component analysis (PCA) is used to extract features. These features are used to train seven different classifiers, and hybrid structures of these classifiers, for comparison. For all materials, the artificial systems were evaluated against each other and compared with human performance, and all were found to outperform the human participants' average performance. These results highlight the sensitive nature of the BioTAC sensors and pave the way for research that requires a sensitive and accurate approach, such as vital-signs monitoring with robotic systems.
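The PCA-then-classify pipeline can be sketched in a few lines. This is a hedged illustration, not the paper's pipeline: the abstract names seven classifiers, whereas the sketch below uses a single nearest-centroid classifier on PCA features purely to make the two-stage structure concrete, and all function names are hypothetical.

```python
import numpy as np

def pca_features(X, k=2):
    # Center the data and project onto the top-k principal components
    # obtained from the SVD of the centered data matrix.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]

def fit_centroids(Z, labels):
    # Stand-in for one of the paper's classifiers: store the mean
    # PCA feature vector of each material class.
    return {c: Z[labels == c].mean(axis=0) for c in np.unique(labels)}

def predict(Z, centroids):
    # Assign each sample to the class with the nearest centroid.
    classes = list(centroids)
    dists = np.stack([np.linalg.norm(Z - centroids[c], axis=1) for c in classes])
    return np.array([classes[i] for i in dists.argmin(axis=0)])
```

In practice the extracted PCA features would be fed to each of the seven classifiers (and their hybrids) and the results compared, as the abstract describes.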