The perceptron is an essential building block of neural network (NN)-based machine learning; however, the effectiveness of its various circuit implementations is rarely demonstrated through chip testing. This paper presents measured silicon results for analog perceptron circuits fabricated in a 0.6 μm/±2.5 V complementary metal oxide semiconductor (CMOS) process, comprising digital-to-analog converter (DAC)-based multipliers and phase shifters. The measurement results confirm that our implementation achieves the correct function and good performance. Furthermore, we propose a multi-layer perceptron (MLP) built from the analog perceptron, in which the structure, the number of neurons, and the weights can all be flexibly configured. As an example, we design a 2-3-4 MLP circuit with rectified linear unit (ReLU) activation, consisting of 2 input neurons, 3 hidden neurons, and 4 output neurons. Simulations of this circuit show a power dissipation of 200 mW, an operating frequency range from 0 to 1 MHz, and an error ratio within 12.7%. Finally, to demonstrate the feasibility and effectiveness of configuring an MLP from our analog perceptron, seven additional analog MLPs designed with the same approach are simulated and analyzed with respect to various specifications, two of which are compared against digital counterparts with the same structures.
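For reference, the following is a minimal behavioral sketch in Python of the 2-3-4 MLP topology with ReLU activation described above. It is a software model of the network structure only, not of the analog circuit; the weight values, biases, and input vector are placeholders, and applying ReLU at the output layer is an assumption.

```python
import numpy as np

def relu(x):
    # Rectified linear unit activation: elementwise max(0, x)
    return np.maximum(0.0, x)

# Placeholder weights and biases for a 2-3-4 MLP
# (2 input neurons, 3 hidden neurons, 4 output neurons).
# In the analog implementation these would correspond to
# values programmed via the DAC-based multipliers.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2))   # hidden-layer weights
b1 = np.zeros(3)                   # hidden-layer biases
W2 = rng.standard_normal((4, 3))   # output-layer weights
b2 = np.zeros(4)                   # output-layer biases

def mlp_2_3_4(x):
    # Forward pass: input (2,) -> hidden (3,) with ReLU -> output (4,)
    h = relu(W1 @ x + b1)
    return relu(W2 @ h + b2)  # output ReLU assumed

print(mlp_2_3_4(np.array([0.5, -0.2])))
```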