Robotics: Science and Systems XVI 2020
DOI: 10.15607/rss.2020.xvi.020
Event-Driven Visual-Tactile Sensing and Learning for Robots

Abstract: This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using the NeuTouch and Prophesee event camera) on two robot tasks: …

Cited by 82 publications (81 citation statements) · References 50 publications
“…SLAYER with Loihi has also been used for tactile digit recognition by See et al [38] and for sensor fusion because of the ease of combining modalities in the spike domain. Ceolini et al [39] combined EMG and vision data in a gesture classification task, and Taunyazov et al [40] combined vision and tactile data in a grasping task.…”
Section: B. Direct Deep SNN Training
Confidence: 99%
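The excerpt above notes that spike-domain representations make multi-modal fusion straightforward: once both sensors emit spike trains, the modalities can simply be concatenated along the channel axis before a shared spiking layer. The following is a minimal NumPy sketch of that idea, not the paper's VT-SNN implementation; the spike-train shapes, weights, and the single leaky integrate-and-fire readout neuron are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary spike trains from two event sensors, shaped
# (time_steps, channels). Shapes and firing rates are made up for
# illustration, not taken from the paper.
T = 100
vision_spikes = (rng.random((T, 8)) < 0.1).astype(np.float32)
tactile_spikes = (rng.random((T, 4)) < 0.2).astype(np.float32)

# Fusion in the spike domain: concatenate channel dimensions so both
# modalities drive one downstream spiking layer.
fused = np.concatenate([vision_spikes, tactile_spikes], axis=1)  # (T, 12)

def lif(spikes, weights, decay=0.9, threshold=1.0):
    """Single leaky integrate-and-fire neuron over a spike train.

    Membrane potential decays each step, accumulates weighted input
    spikes, and emits an output spike (with reset) on crossing the
    threshold.
    """
    v = 0.0
    out = []
    for t in range(spikes.shape[0]):
        v = decay * v + float(spikes[t] @ weights)
        if v >= threshold:
            out.append(1)
            v = 0.0  # reset after spiking
        else:
            out.append(0)
    return np.array(out)

# Uniform weights over the fused channels (again, illustrative only).
w = np.full(fused.shape[1], 0.1)
out_spikes = lif(fused, w)
print(fused.shape, int(out_spikes.sum()))
```

The design point is that fusion costs only a concatenation: no modality-specific preprocessing is needed once both inputs share the spike representation, which is what the citing works highlight about training with SLAYER on Loihi.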
“…Much work remains to be done on the spiking algorithm front, but steady progress is being made. Early efforts demonstrate digit recognition, fusion of visual and tactile perception [40], fusion of visual and EMG information [39], persistent attention and tracking [68], and online learning of gestures [102] using event-based sensors interfaced to Loihi.…”
Section: A. Event-Based Sensing and Perception
Confidence: 99%
“…A few other works tried to embed multiple camera sensors inside the tactile sensor to retrieve the best possible internal tactile force fields [22]. On the other hand, there has been growing enthusiasm for learning-based approaches that apply deep learning to the estimation of tactile information [23]. Vision-based tactile sensing mechanisms can typically be classified into two approaches: traditional image processing/computer vision-based methods and learning-based methods.…”
Section: Literature Review
Confidence: 99%
“…In the earliest work [25], we proposed a method to use an event-based camera (Dynamic Vision Sensor, DVS) to detect incipient slippage using traditional image processing techniques. Similarly, new approaches for slip detection with the DVS are investigated in [26]–[28]. In another work, we proposed a novel framework based on the DVS to acquire force magnitude and classify materials in a grasp [13].…”
Section: Related Work
Confidence: 99%