2022
DOI: 10.1109/tcds.2021.3097675
An End-to-End Spiking Neural Network Platform for Edge Robotics: From Event-Cameras to Central Pattern Generation

Cited by 19 publications (9 citation statements)
References 38 publications
“…Thanks to their desirable characteristics, SNNs have gathered interest in a range of robotics applications, including control [5]-[8], manipulation [9], [10], scene understanding [11], and object tracking [12]. Key works that use spiking networks for robot localization, the task considered in this paper, include an energy-efficient uni-dimensional SLAM [39], a robot navigation controller system [40], a pose estimation and mapping system [41], and models of the place, grid, and border cells of the rat hippocampus [42] based on RatSLAAM [43].…”
Section: A Spiking Neural Network In Robotics Research
confidence: 99%
“…SNNs have thus been used in a number of robotics applications [5]-[12], including the visual place recognition (VPR) task [13], [14] that is considered in this paper. A VPR system has to find the matching reference image given a query image of a place, with the difficulty that the appearance of the query image can differ significantly from the reference image due to changes in season, time of day, or weather conditions [15]-[18].…”
Section: Introduction
confidence: 99%
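The matching step described in the excerpt above can be sketched as a nearest-neighbour search over image descriptors. The function name and the use of plain feature vectors with cosine similarity are illustrative assumptions, not details from the cited works:

```python
# Sketch of the core VPR matching step: given a query descriptor,
# return the best-matching reference image by cosine similarity.
# Real systems would use learned or hand-crafted image descriptors
# and typically a larger reference database.
import numpy as np

def match_place(query_desc, ref_descs):
    """Return (best_index, similarity) over rows of ref_descs."""
    q = query_desc / np.linalg.norm(query_desc)
    R = ref_descs / np.linalg.norm(ref_descs, axis=1, keepdims=True)
    sims = R @ q                      # cosine similarity to each reference
    best = int(np.argmax(sims))
    return best, float(sims[best])
```

The hard part of VPR, per the excerpt, is that appearance change (season, time of day, weather) degrades the descriptors, so the similarity of the true match can drop below that of a wrong but visually similar place.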
“…Fragment of a table listing hardware platforms:
• HoLLiE arm [31]
• FPGA ZEM4310, Emotiv EPOC headset [23]
• DVS, NAS [102]
• UR3 with elbow and wrist [105]
• Prosthetic hand [86]
• Tobii Eye Tracker, YDLIDAR G4, Raspberry Pi 3 [100]
• Cognition and Learning [117]
• Robotic hand, Myo armband, EPOC headset [25], [118]
• Loihi, Neural Compute Stick 2 [112]
• Spartan-6, RGB sensor [113]
• Raspberry Pi 3, PiCamera, PiStrom [117]
• Loihi, LIDAR [114]
TABLE 6: Software/simulators and neuron models for all the applications.…”
Section: Soundman
confidence: 99%
“…The experimental results show that the robot was able to replicate the gait pattern generated through the user's mental activity with a slight delay. Similarly, Lele et al. [102] coupled a CPG with a Dynamic Vision Sensor (DVS) for a prey-tracking scenario in a closed-loop robotic system. Here, the legs of a hexapod robot are controlled by a network of spiking neurons.…”
Section: Motor Control Applications
confidence: 99%
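The CPG-style control mentioned above can be illustrated with a minimal half-center oscillator: two leaky integrate-and-fire neurons with spike-frequency adaptation and mutual inhibition take turns bursting, the way antagonist leg phases alternate in a gait. All parameter values below are illustrative assumptions, not values from [102]:

```python
# Minimal half-center CPG sketch: two LIF neurons with
# spike-frequency adaptation and mutual inhibition produce
# alternating bursts. Euler integration, dt in milliseconds.

def simulate_cpg(steps=1000, dt=1.0):
    tau_v, tau_s, tau_a = 10.0, 20.0, 100.0  # membrane, synapse, adaptation
    drive, w_inh, b_adapt, v_th = 1.5, 2.0, 0.2, 1.0
    v = [0.5, 0.0]       # asymmetric start breaks the tie
    s = [0.0, 0.0]       # inhibitory synaptic traces
    a = [0.0, 0.0]       # adaptation variables
    spikes = [[], []]
    for t in range(steps):
        for i in (0, 1):
            j = 1 - i
            # leaky integration with tonic drive, cross-inhibition, adaptation
            v[i] += dt / tau_v * (drive - v[i] - w_inh * s[j] - a[i])
            s[i] -= dt / tau_s * s[i]
            a[i] -= dt / tau_a * a[i]
        for i in (0, 1):
            if v[i] >= v_th:
                v[i] = 0.0
                s[i] = 1.0       # inhibit the partner
                a[i] += b_adapt  # adaptation eventually ends the burst
                spikes[i].append(t)
    return spikes
```

Adaptation slows the active neuron until its inhibition decays enough for the silent partner to escape and take over; in a hexapod controller, several such oscillators with fixed phase offsets could drive the leg pattern.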
“…Another neuro-inspired system exploration is the first autonomous sensing-to-actuation end-to-end spike-only processing pipeline for hexapod robots [35]. The goal is to demonstrate the functionality of spike-only processing and evaluate the potential of event-driven processing modalities.…”
Section: Neuro-inspired End-to-end Spike-only Processing
confidence: 99%