The energy consumption associated with data movement between memory and processing units is the main roadblock to the massive deployment of edge Artificial Intelligence. To overcome this challenge, Binarized Neural Networks (BNN) coupled with RRAM-based in- or near-memory computing constitute an appealing solution. However, proposals from the literature tend to involve significant peripheral-circuit overhead. In this work, we propose and demonstrate experimentally, on a fabricated hybrid CMOS-RRAM integrated circuit, a robust in-memory XOR operation based on a 2T2R cell operated as a resistive bridge. With this architecture, the RRAM read operation and the BNN multiplication can be performed simultaneously, requiring only inverters connected to each Source Line of the memory array, while the BNN POPCOUNT operation is realized with an analog capacitive neuron. Based on our measurements and extensive Monte Carlo simulations, we validate that this approach is suitable for large neurons with a low error rate (3.12% over the full range of POPCOUNT values). Circuit simulation results further highlight the resilience of this approach at the network level, with minimal accuracy degradation on the MNIST (0.07%) and CIFAR-10 (0.35%) tasks with respect to software solutions.
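As context for the in-memory operations described above, the following is a minimal software reference sketch of the arithmetic a binarized neuron performs: an element-wise XNOR standing in for the multiplication (the complement of the XOR computed in-memory by the 2T2R bridge), followed by a POPCOUNT and a threshold comparison. The function name, the threshold parameter, and the NumPy formulation are illustrative assumptions, not part of the fabricated circuit.

```python
import numpy as np

def binary_neuron(inputs, weights, threshold):
    """Software reference of a binarized neuron (illustrative, not the circuit).

    inputs, weights: arrays of 0/1 bits encoding the binary values -1/+1.
    Returns the binarized activation (0 or 1).
    """
    # XNOR plays the role of the multiplication: matching bits
    # (i.e. +1*+1 or -1*-1) yield 1.
    products = np.logical_not(np.logical_xor(inputs, weights)).astype(int)
    # POPCOUNT: number of set bits after the element-wise XNOR.
    popcount = products.sum()
    # Sign activation expressed as a comparison against a threshold.
    return int(popcount >= threshold)

# Example: a 16-input neuron firing when at least half of the XNORs are 1.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 16)
w = rng.integers(0, 2, 16)
print(binary_neuron(x, w, threshold=8))
```

In the proposed architecture, the XOR (and hence the XNOR up to an inverted threshold) is obtained directly from the 2T2R read, and the summation is accumulated by the analog capacitive neuron rather than a digital adder.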