2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489241

Confronting machine-learning with neuroscience for neuromorphic architectures design

Abstract: Artificial neural networks are experiencing today an unprecedented interest thanks to two main changes: the explosion of open data that is necessary for their training, and the increasing computing power of today's computers that makes the training part possible in a reasonable time. The recent results of deep neural networks on image classification have given neural networks the leading role in machine learning algorithms and artificial intelligence research. However, most applications such as smart devices or…

Cited by 33 publications (22 citation statements)
References 17 publications
“…Second, the spiking neuron is the Leaky Integrate-and-Fire (LIF), a simplified bio-mimetic artificial neuron model that mimics the behavior of biological neurons. The LIF computation is hence simpler than that of the formal neuron, and we have previously made a comparative study of the information-coding impact of both spiking and formal models on neuromorphic architectures with off-line supervised learning [12]. It showed that the SNN yields approximately a 50% gain in hardware implementation cost (resources and power).…”
Section: B. STDP-based Spiking Neural Network (mentioning)
confidence: 99%
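For reference, the LIF neuron mentioned in this excerpt can be captured in a few lines. The following is a minimal Python sketch of a discrete-time LIF update; the function name and parameter values (tau, v_thresh, etc.) are illustrative choices, not taken from the cited implementation.

    def lif_step(v, i_in, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
        # Leak the membrane potential toward rest, then integrate the input current.
        v = v + (dt / tau) * (v_rest - v) + i_in
        if v >= v_thresh:
            return v_reset, True   # threshold crossed: emit a spike and reset
        return v, False

    # Example: a constant input current drives the neuron to spike periodically.
    v, spikes = 0.0, 0
    for t in range(200):
        v, fired = lif_step(v, 0.06)
        spikes += fired
    print(spikes)

Compared with a formal (weighted-sum plus non-linearity) neuron, the update above needs only an addition, a scaled leak, and a threshold comparison, which is the simplicity the excerpt refers to.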
“…Finally, multimodal association bridges the gap between unsupervised and supervised learning, as we obtain approximately the same results as on MNIST using a supervised Multi-Layer Perceptron (MLP) with 95.73% [104] and on S-MNIST using a supervised attention Recurrent Neural Network (RNN) with 94.5% [115] (even though this result was obtained on 20 commands). Multimodal association can also be seen as a way to reach the same accuracy of about 95% as Reference [49] with far fewer neurons, going from 6400 neurons to 356 neurons, that is, a gain of 94% in the total number of neurons.…”
Section: A Universal Multimodal Association Model? (mentioning)
confidence: 63%
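The quoted neuron-count reduction checks out arithmetically (illustrative verification only):

    n_baseline, n_multimodal = 6400, 356
    print(f"{1 - n_multimodal / n_baseline:.1%}")   # 94.4%, matching the reported ~94% gain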
“…Nevertheless, STDP-based multimodal learning is still a promising approach for the hardware efficiency of SNNs [104], and because of the alternative it offers for using event-based sensors with asynchronous computation [105].…”
(mentioning)
confidence: 99%
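For context on the STDP rule mentioned here: pair-based STDP adjusts a synaptic weight according to the relative timing of pre- and post-synaptic spikes. The Python sketch below is illustrative only; the function name and parameters (a_plus, tau_plus, etc.) are assumptions, not values from the cited works.

    import math

    def stdp_update(w, dt_spike, a_plus=0.01, a_minus=0.012,
                    tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
        # dt_spike = t_post - t_pre (ms): pre-before-post potentiates,
        # post-before-pre depresses, both within an exponential time window.
        if dt_spike > 0:
            w += a_plus * math.exp(-dt_spike / tau_plus)
        else:
            w -= a_minus * math.exp(dt_spike / tau_minus)
        return min(max(w, w_min), w_max)   # clip to the allowed weight range

    print(stdp_update(0.5, 5.0))    # pre fires 5 ms before post -> weight increases
    print(stdp_update(0.5, -5.0))   # post fires 5 ms before pre -> weight decreases

Because the update depends only on local spike times, it maps naturally onto event-driven hardware, which is one reason the excerpt links STDP to hardware efficiency.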
“…In the low-spiking-rate region, the energy consumption per spike in the OCC part increases because the energy consumption of the OCC part is dominated by leakage components during idle time; hence, the OCC part exhibits better energy efficiency in the accelerated spiking-rate region, where the total energy consumption is dominated by dynamic power. Several SNN accelerator architectures are under investigation to achieve better energy efficiency than standard ANN accelerators, and have already exhibited similar or better energy efficiency [35], [36]. Therefore, in the future, SNN accelerators that adopt the OCC part can be energy efficient thanks to synaptic off-current blocking operations.…”
Section: Pre-Layer Spike (mentioning)
confidence: 99%
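The leakage-versus-dynamic trade-off described in this excerpt amounts to a simple energy-per-spike model: the total energy over a time window is a static (leakage) term plus a dynamic term proportional to the spike count. The sketch below uses made-up numbers purely to illustrate why the static term dominates at low spike rates.

    def energy_per_spike(p_leak_w, e_dyn_j, n_spikes, window_s=1.0):
        # Total energy over the window = static (leakage) energy + dynamic energy per spike.
        total_j = p_leak_w * window_s + e_dyn_j * n_spikes
        return total_j / n_spikes

    # Illustrative values only: 1 uW leakage power, 100 pJ dynamic energy per spike.
    print(energy_per_spike(1e-6, 1e-10, 10))       # low rate: ~1e-7 J/spike (leakage-dominated)
    print(energy_per_spike(1e-6, 1e-10, 10_000))   # high rate: ~2e-10 J/spike (dynamic-dominated)

With these assumed numbers, energy per spike drops by roughly three orders of magnitude when the spiking rate rises, matching the qualitative argument in the excerpt.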