2022
DOI: 10.1109/access.2022.3196688
Energy Efficient Learning With Low Resolution Stochastic Domain Wall Synapse for Deep Neural Networks

Abstract: We demonstrate that extremely low resolution quantized (nominally 5-state) synapses with large stochastic variations in synaptic weight can be energy efficient and achieve reasonably high testing accuracies compared to Deep Neural Networks (DNNs) of similar sizes using floating precision synaptic weights. Specifically, voltage-controlled domain wall (DW) devices demonstrate stochastic behavior and can only encode limited states; however, they are extremely energy efficient during both training and inference. In thi…

Cited by 15 publications (18 citation statements) · References 55 publications
“…The same thresholding algorithm is used here as in the ideal 5-bit synapse case (black plots in Figure ), but the weight update now deviates from the fixed value of +ΔW or −ΔW as per the nonlinearity and asymmetry of the LTP and LTD curves (Figure (a)). The drop in accuracy for both data sets between the case of ideal 5-bit synapses and that of experimentally obtained 5-bit synapses occurs because of the error introduced during training by the nonlinearity and asymmetry of the synaptic behavior. …”
Section: Results (mentioning, confidence: 97%)
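The thresholded update rule this citation describes can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the step size `dw`, gradient `threshold`, weight bounds, and `noise_std` are hypothetical parameters chosen only to show the mechanism (a nominally fixed ±ΔW step perturbed by device stochasticity).

```python
import numpy as np

rng = np.random.default_rng(0)

def thresholded_update(w, grad, dw=0.5, threshold=0.1,
                       w_min=-1.0, w_max=1.0, noise_std=0.05):
    """Hypothetical sketch of a thresholded update for low-resolution synapses.

    If |grad| exceeds the threshold, the weight moves by a nominally fixed
    step +dw or -dw (descending the gradient); a Gaussian perturbation on
    the step models the cycle-to-cycle stochasticity attributed to the DW
    devices. All parameter values are illustrative, not from the paper.
    """
    step = np.where(np.abs(grad) > threshold, -np.sign(grad) * dw, 0.0)
    step += np.where(step != 0.0,
                     rng.normal(0.0, noise_std, size=np.shape(w)), 0.0)
    return np.clip(w + step, w_min, w_max)

w = np.zeros(4)
g = np.array([0.5, -0.3, 0.05, 0.0])  # only |g| > threshold triggers an update
w_new = thresholded_update(w, g)
```

Clipping to [w_min, w_max] stands in for the limited number of encodable device states.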
“…Even though idealistic simulations show extremely linear and symmetric domain-wall and skyrmion motion, such motion is hard to achieve experimentally given the presence of defects, thermal noise, and other nonidealities. Yet most studies on spintronic synaptic devices reported thus far are based on idealistic micromagnetic simulations of the devices (leading to very linear and symmetric LTP and LTD) and subsequent device-system co-design and co-simulation of crossbar arrays based on these devices. Unless the synaptic device model is based on actual experiments, it does not capture nonlinearities in LTP and LTD, reduction in bit resolution, cycle-to-cycle variations, etc., and this leads to underestimation of the classification errors made by crossbar arrays simulated from it. …”
Section: Introduction (mentioning, confidence: 99%)
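A common way to model the nonlinear, asymmetric LTP/LTD behavior this citation refers to is an exponential-saturation fit of the potentiation and depression curves. The sketch below uses that generic model with assumed parameters (nonlinearity factor `nl`, pulse count `n_pulses`, conductance bounds), not values fitted to any cited device.

```python
import numpy as np

def ltp_curve(pulse, g_min=0.0, g_max=1.0, nl=3.0, n_pulses=32):
    """Illustrative nonlinear LTP model: conductance rises with pulse number
    but saturates exponentially. nl -> 0 approaches a linear update; larger
    nl means stronger nonlinearity. Parameters are assumptions.
    """
    a = (g_max - g_min) / (1.0 - np.exp(-nl))
    return g_min + a * (1.0 - np.exp(-nl * pulse / n_pulses))

def ltd_curve(pulse, g_min=0.0, g_max=1.0, nl=3.0, n_pulses=32):
    """Mirror-image depression curve; asymmetry between potentiation and
    depression appears when the LTP and LTD nonlinearity factors differ."""
    a = (g_max - g_min) / (1.0 - np.exp(-nl))
    return g_max - a * (1.0 - np.exp(-nl * pulse / n_pulses))

p = np.arange(33)           # pulse index 0..32
g_up = ltp_curve(p)         # monotonically increasing, saturating
g_dn = ltd_curve(p)         # monotonically decreasing, saturating
```

Fitting `nl` separately for LTP and LTD to measured device data is one way to bring the nonidealities the quote mentions into a crossbar simulation.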
“…Here, η is defined as a random variable having a Gaussian distribution with zero mean and unit variance, drawn independently in each of the three Cartesian coordinates at each time step. k_B, T, V, and Δt are the Boltzmann constant, temperature, volume of each simulation cell, and time step, respectively [44,45]. The dimensions of the nanotrack are taken as 324 × 120 × 2 nm³ for the simulations.…”
Section: Methods (mentioning, confidence: 99%)
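The thermal-noise term described here is typically realized in stochastic (Langevin-type) LLG micromagnetic simulations as a random field of amplitude sqrt(2 α k_B T / (μ₀ γ M_s V Δt)). The sketch below assumes that standard form; the damping α, saturation magnetization `m_s`, cell volume, and time step are illustrative defaults, not values from the paper.

```python
import numpy as np

# Physical constants (SI units)
K_B = 1.380649e-23      # Boltzmann constant, J/K
MU_0 = 4e-7 * np.pi     # vacuum permeability, T*m/A
GAMMA = 1.760859e11     # gyromagnetic ratio, rad/(s*T)

def thermal_field(n_cells, T=300.0, alpha=0.01, m_s=8e5,
                  cell_volume=8e-27, dt=1e-13, rng=None):
    """Sketch of the stochastic thermal field for Langevin-type LLG:
    H_th = eta * sqrt(2 alpha k_B T / (mu_0 gamma M_s V dt)),
    where eta is a zero-mean, unit-variance Gaussian drawn independently
    for each Cartesian component of each cell at every time step.
    All parameter defaults are assumptions for illustration.
    """
    rng = rng or np.random.default_rng()
    amplitude = np.sqrt(2.0 * alpha * K_B * T /
                        (MU_0 * GAMMA * m_s * cell_volume * dt))
    eta = rng.standard_normal((n_cells, 3))  # independent per coordinate
    return amplitude * eta  # field in A/m

h = thermal_field(1000, rng=np.random.default_rng(1))
```

Redrawing η at every time step (rather than reusing it) is what makes the noise white in time, matching the description in the quote.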
“…Due to the tunability of DW-MTJ neurons and synapses, there are opportunities to design neural network accelerators that benefit from the characteristics outlined in the previous sections. Because linear and symmetric updates have been demonstrated for DW-MTJ devices [126,127], these devices are effective artificial synapses for deep neural networks. On top of this, the tunability of the magnetic dynamics of the DW-MTJ synapses can be leveraged for specific types of deep neural networks.…”
Section: DW-MTJ Neural Network Design (mentioning, confidence: 99%)