In this paper, we propose a learning rule based on the back-propagation (BP) algorithm that can be applied to a hardware-based deep neural network (HW-DNN) using electronic devices that exhibit discrete and limited conductance characteristics. This adaptive learning rule, which enables forward propagation, backward propagation, and weight updates to be performed in hardware, is helpful for implementing power-efficient and high-speed deep neural networks. In simulations using a three-layer perceptron network, we evaluate the learning performance for various conductance responses of electronic synapse devices and weight-updating methods. We show that the learning accuracy is comparable to that obtained with a software-based BP algorithm when the electronic synapse device has a linear conductance response with a high dynamic range. Furthermore, the proposed unidirectional weight-updating method is suitable for electronic synapse devices that have nonlinear and finite conductance responses. Because this weight-updating method compensates for the drawback of asymmetric weight updates, it achieves better accuracy than other methods. The adaptive learning rule, which can be applied to a full hardware implementation, can also compensate for the degradation of learning accuracy caused by device-to-device variation in actual electronic synapse devices.
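As a rough illustration of the kind of update this abstract describes, the Python sketch below models a saturating (nonlinear, bounded) conductance response and a unidirectional weight update applied to a differential pair of devices, so that only potentiation is ever used. The device model, parameter values, and function names are assumptions chosen for illustration, not the authors' exact formulation.

```python
import numpy as np

# Illustrative (hypothetical) device parameters, not taken from the paper.
G_MIN, G_MAX = 0.0, 1.0   # normalized conductance range
N_STATES = 64             # number of discrete conductance levels
NONLINEARITY = 3.0        # larger value -> more strongly saturating response

def potentiate(g, n_pulses=1):
    """Apply potentiation pulses with a saturating (nonlinear) response:
    each pulse moves the conductance a fraction of the remaining headroom."""
    for _ in range(n_pulses):
        step = (G_MAX - g) * (1.0 - np.exp(-NONLINEARITY / N_STATES))
        g = min(G_MAX, g + step)
    return g

def unidirectional_update(g_plus, g_minus, delta_w):
    """Sketch of a unidirectional weight update: the weight is stored as
    w = g_plus - g_minus, and only potentiation is applied.  A positive
    delta_w potentiates g_plus, a negative delta_w potentiates g_minus,
    so the asymmetry between potentiation and depression never enters."""
    if delta_w > 0:
        g_plus = potentiate(g_plus)
    elif delta_w < 0:
        g_minus = potentiate(g_minus)
    return g_plus, g_minus

# Example: a small positive gradient step increases the effective weight.
gp, gm = unidirectional_update(0.2, 0.2, delta_w=+0.01)
print(gp - gm)  # effective weight after the update
```

In such schemes the device pair is typically reset (both conductances erased) once either cell saturates; that housekeeping step is omitted here for brevity.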
In this paper, we review recent trends in neuromorphic computing using emerging memory technologies. Two representative learning algorithms used to implement hardware-based neural networks are described: a bio-inspired learning algorithm and a software-based learning algorithm, in particular back-propagation. The requirements that a synaptic device must satisfy for each algorithm are analyzed. We then review research trends in synaptic devices for implementing artificial neural networks.
We propose a design for multi-layer neural networks that uses 2D NAND flash memory cells as high-density and reliable synaptic devices. Our operation scheme eliminates the waste of NAND flash cells and allows analogue input values. A 3-layer perceptron network with 40,545 synapses is trained on the MNIST dataset using an adaptive weight-update method for hardware-based multi-layer neural networks. The conductance response of NAND flash cells is measured, and it is shown that the unidirectional conductance response is suitable for implementing multi-layer neural networks with NAND flash memory cells as synaptic devices. Using online learning, we obtain higher learning accuracy with NAND synaptic devices than with a memristor-based synapse, regardless of the weight-update method. Using an adaptive weight-update method based on the unidirectional conductance response, we obtain a learning accuracy of 94.19% with NAND synaptic devices, comparable to the 94.69% obtained with synapses based on an ideal, perfectly linear device. Therefore, NAND flash memory, which is a mature technology with a great advantage in cell density, can be a promising synaptic device for implementing high-density multi-layer neural networks.
Index Terms: Neuromorphic, NAND flash memory, deep neural networks (DNNs), synaptic device, deep learning, multi-layer neural networks, hardware-based neural network.
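To make the mapping concrete, the sketch below shows, under assumed layer sizes and a standard differential-pair weight mapping, how a 3-layer perceptron forward pass corresponds to the current summation a synaptic array would perform. The layer sizes, activation function, and helper names are illustrative assumptions; this is not the paper's exact 40,545-synapse topology or NAND operation scheme.

```python
import numpy as np

# Hypothetical layer sizes for illustration only.
N_IN, N_HID, N_OUT = 784, 50, 10

def weights_from_conductance(g_plus, g_minus):
    """Each weight is represented by a pair of cell conductances,
    w = g_plus - g_minus (a common differential-pair mapping)."""
    return g_plus - g_minus

def forward(x, w1, w2):
    """Forward pass of a 3-layer (input-hidden-output) perceptron; the
    matrix-vector products stand in for the analogue current summation
    that the synaptic-array hardware would perform."""
    h = np.tanh(x @ w1)   # hidden-layer activation
    return np.tanh(h @ w2)  # output-layer activation

# Example usage with random conductances in a normalized range.
rng = np.random.default_rng(0)
w1 = weights_from_conductance(rng.uniform(0, 1, (N_IN, N_HID)),
                              rng.uniform(0, 1, (N_IN, N_HID)))
w2 = weights_from_conductance(rng.uniform(0, 1, (N_HID, N_OUT)),
                              rng.uniform(0, 1, (N_HID, N_OUT)))
y = forward(rng.uniform(0, 1, N_IN), w1, w2)
print(y.shape)  # (10,) class scores for one MNIST-sized input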