2020
DOI: 10.1109/mim.2020.8979519
Nonlinear retinal response modeling for future neuromorphic instrumentation

Cited by 15 publications (4 citation statements)
References 16 publications
“…For the smallest measured time step of dt = 10⁻⁶ s in the dual path (i.e., rod and cone cells are connected via electrical synapses), the method implemented is over 50% faster than the benchmark. Similar speed improvements were noted in the C++ implementation [18].…”
Section: Results (supporting)
confidence: 73%
“…Neurons can respond to sensory stimuli over an enormous dynamic range. In the retina, neurons can detect anything from individual photons to an influx of millions of photons [96], [97], [98], [99], [100]. To handle such widely varying stimuli, sensory transduction systems likely compress stimulus intensity with a logarithmic dependency.…”
Section: Input Data To An SNN May Be Converted Into a Firing Rate A F... (mentioning)
confidence: 99%
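The logarithmic compression mentioned in this citing statement can be sketched as a simple intensity-to-firing-rate mapping. This is an illustrative model only, not code from the cited work; the function name `intensity_to_rate` and the parameters `r_max` (rate ceiling in Hz) and `i_half` (flux scale) are assumptions chosen for the example.

```python
import numpy as np

def intensity_to_rate(photon_flux, r_max=100.0, i_half=1000.0):
    """Map stimulus intensity to a firing rate with logarithmic compression.

    Hypothetical parameters: r_max is the rate ceiling (Hz) reached at a
    flux of 1e6 photons; i_half sets the scale of the log compression.
    """
    return r_max * np.log1p(photon_flux / i_half) / np.log1p(1e6 / i_half)

# A million-fold range of input intensities maps onto a modest range of rates.
rates = intensity_to_rate(np.array([1.0, 1e3, 1e6]))
```

Under this sketch, rates grow monotonically but slowly with intensity, so both single-photon and saturating stimuli stay within the neuron's limited firing range.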
“…Utilization: Use of event-based cameras for tactile sensing applications provides a much higher sampling rate as well as a significant reduction in power consumption. Neuromorphic vision sensors have become significantly popular recently and introduce a new paradigm in computer vision applications for instrumentation and measurement [29,30]. The Dynamic Vision Sensor (DVS) that is used in this paper is one of the well-known neuromorphic cameras.…”
Section: Neuromorphic Vision-Based Tactile Sensor (mentioning)
confidence: 99%