Abstract: Recent years have witnessed growing interest in the field of brain-inspired computing based on neural-network architectures. In order to translate the related algorithmic models into powerful, yet energy-efficient cognitive-computing hardware, computing devices beyond CMOS may need to be explored. The suitability of such devices to this field of computing would depend strongly upon how closely their physical characteristics match the essential computing primitives employed in such models. In this work we …
“…12a. Compared with the CMOS analog and digital neurons in [15,27], STT-SNN offers the possibility of more than two orders of magnitude lower energy dissipation. The energy of the LSV-based spin-neuron (step function) is around one order of magnitude larger than that of STT-SNN because of the large hard-axis preset energy [13].…”
Section: Application and Performance Results (mentioning)
confidence: 99%
“…It acts as a low-power and compact current comparator that can be employed as an energy-efficient, current-mode, hard-limiting step-function artificial neuron. Note that this current can be further reduced by lowering the energy barrier or applying spin-orbital coupling [15].…”
Section: B Unipolar Domain Wall Motion Neuron (mentioning)
confidence: 99%
“…Previously, we proposed the application of hard-limiting spin-neurons based on lateral spin valves (LSV) [13], as well as domain wall motion (DWM) magnets [14,15] for designing ultra-low power artificial neural networks. Fig.…”
Section: Previous Work On Hard-limiting Spin-neurons (mentioning)
confidence: 99%
“…Such magnetometallic devices can operate at ultra-low terminal voltages and can implement the current-mode summation and non-linear operations required by an artificial neuron. We previously proposed the application of spin-torque neurons based on lateral spin valve (LSV) and domain wall motion (DWM) magnets for designing ultra-low power neural networks [13][14][15]. However, all of the previously proposed spin-neurons implement the hard-limiting step-function, which leads to larger network sizes and cannot provide adequate modeling accuracy for complex classification problems.…”
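The distinction drawn in this snippet, between a hard-limiting step-function neuron and a soft-limiting (analog) transfer function, can be sketched abstractly as follows. This is a minimal behavioral model only; the function names, weights, and threshold are illustrative and do not come from the cited works:

```python
import numpy as np

def step_neuron(inputs, weights, threshold=0.0):
    """Hard-limiting spin-neuron model: current-mode summation of
    weighted inputs, followed by a binary threshold (step) decision."""
    net_current = np.dot(inputs, weights)  # summation of input currents
    return 1 if net_current > threshold else 0

def analog_neuron(inputs, weights):
    """Soft-limiting (analog) neuron model: a sigmoid transfer gives a
    graded output, which generally allows smaller networks and better
    modeling accuracy than a hard step."""
    net_current = np.dot(inputs, weights)
    return 1.0 / (1.0 + np.exp(-net_current))

x = np.array([0.5, -0.2, 0.8])
w = np.array([1.0, 0.5, -0.3])
print(step_neuron(x, w))    # binary output: 0 or 1
print(analog_neuron(x, w))  # graded output in (0, 1)
```

The step neuron discards all magnitude information above or below the threshold, which is why networks built from it typically need more units to reach a given accuracy.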
“…Such magneto-metallic devices can operate at ultra-low terminal voltages and can implement current-mode summation and comparison operations at ultra-low energy cost. Such current-mode spin switches or ‘neurons’ can be exploited in energy-efficient analog-mode computing [28,34,35]. In this work we present the design of RCN-based energy-efficient computing blocks for HTM using such ‘spin neurons’.…”
Section: Hierarchical Temporal Memory Based On Spin-neurons and Resis… (mentioning)
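The current-mode summation that this snippet attributes to resistive crossbar networks (RCNs) amounts to an analog dot product: input voltages on the rows produce, at each column, a current equal to the conductance-weighted sum of those voltages. The sketch below illustrates that relation numerically; the array sizes and conductance values are illustrative, not taken from the cited work:

```python
import numpy as np

# Resistive crossbar: voltages V_i applied on rows, conductances G_ij at
# the crosspoints. By Ohm's and Kirchhoff's laws, each column current is
#   I_j = sum_i V_i * G_ij
# i.e. a dot product computed in the analog domain.
voltages = np.array([0.1, 0.2, 0.05])           # row input voltages (V)
conductances = np.array([[1e-6, 2e-6],          # G_ij in siemens
                         [3e-6, 1e-6],
                         [2e-6, 4e-6]])
column_currents = voltages @ conductances       # I_j per column (A)
print(column_currents)  # summed currents, fed to current-mode spin-neurons
```

Each column current would then drive a current-mode spin switch of the kind described above, so the weighted-sum and thresholding stages both stay in the analog current domain.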