Recent advances in quantized neural networks have paved the way for energy-efficient hardware architectures for machine learning tasks. Binary and ternary quantized neural networks are well suited to image classification and recognition on highly resource-constrained hardware. Because binary neural networks (BNNs) use very low weight precision, they suffer significant accuracy loss on dense networks and large datasets. Ternary neural networks (TNNs) address this issue with higher weight precision and better resource utilization. However, TNN implementations using conventional CMOS and memristive devices show limited improvement in area and energy efficiency. Spintronics-based magnetic random access memory (MRAM) devices are among the most prominent choices for non-volatile memory in neural networks. This work presents differential spin Hall effect (DSHE) MRAM-based two- and three-input ternary computation units (TCUs) for TNNs. Furthermore, a multilayer perceptron architecture with a synaptic crossbar array built from the proposed TCU is implemented for MNIST classification. The results show that the DSHE-based TCU is 30% more energy efficient than an STT-MRAM-based design, and the DSHE-MRAM-based TNN improves energy and area by 82% and 9%, respectively, compared to an STT-based TNN.
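To make the ternary-weight idea concrete, the sketch below quantizes real-valued weights to {-1, 0, +1} using a common magnitude-threshold heuristic (delta = 0.7 * mean(|w|)). This is an illustrative scheme from the general TNN literature, not the quantization method of this paper, and the `delta_scale` value is an assumed hyperparameter.

```python
import numpy as np

def ternarize(weights, delta_scale=0.7):
    """Quantize real-valued weights to {-1, 0, +1}.

    Uses the threshold heuristic delta = delta_scale * mean(|w|);
    delta_scale=0.7 is an illustrative choice, not this paper's scheme.
    """
    delta = delta_scale * np.mean(np.abs(weights))
    t = np.zeros_like(weights)
    t[weights > delta] = 1.0    # strong positive weights -> +1
    t[weights < -delta] = -1.0  # strong negative weights -> -1
    return t                    # everything near zero stays 0

w = np.array([0.9, -0.05, -0.8, 0.1, 0.4])
print(ternarize(w))  # [ 1.  0. -1.  0.  1.]
```

The extra zero state is what distinguishes a TNN from a BNN: near-zero weights are pruned to 0 rather than forced to +/-1, which is the precision gain the abstract refers to.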