Hardware implementations of spiking neural networks, known as neuromorphic architectures, offer an explicit route to studying brain-like computation. Accordingly, biological features of the brain may inspire the next generation of computers and electronic systems in areas such as signal processing, image processing, function approximation, and pattern recognition. Approximating nonlinear functions has many applications in computer science and applied mathematics. The sigmoid is the most widely used activation function in neural networks, and it defines the relationship between biological and artificial neurons. Because its output lies between 0 and 1, it is well suited to representing probabilities. In this paper, a spiking neural network based on Izhikevich neurons and a gradient-descent learning algorithm is proposed to approximate the sigmoid and other nonlinear functions. The flexibility of the spiking network is demonstrated by reporting the average relative errors of the approximation. A time- and cost-efficient digital neuromorphic implementation based on an on-chip learning method for approximating the sigmoid function is also discussed. The paper reports the hardware synthesis results and the physical implementation of the spiking network on a field-programmable gate array. The maximum frequency and throughput of the implemented network were 83.209 MHz and 9.86 Mb/s, respectively.
KEYWORDS: digital hardware, field-programmable gate array (FPGA), hardware implementation, neuromorphic, on-chip learning, spiking neural networks (SNNs)
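For readers unfamiliar with the neuron model named above, the following is a minimal sketch of the standard Izhikevich (2003) dynamics in software, integrated with the Euler method. It is illustrative only: the parameter values are the textbook regular-spiking settings, and it does not reproduce the paper's fixed-point FPGA implementation or its learning rule.

```python
def simulate_izhikevich(I=10.0, T=2000, dt=0.5,
                        a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one Izhikevich neuron with constant input current I.

    Standard regular-spiking parameters (a, b, c, d); dt in ms.
    Returns the list of time steps at which the neuron spiked.
    """
    v, u = c, b * c          # membrane potential (mV) and recovery variable
    spikes = []
    for t in range(T):
        # Izhikevich dynamics: v' = 0.04v^2 + 5v + 140 - u + I,
        #                      u' = a(bv - u)
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike threshold (mV)
            spikes.append(t)
            v, u = c, u + d  # after-spike reset
    return spikes

# A sufficiently large input current produces sustained spiking,
# while zero input leaves the neuron at rest.
spike_times = simulate_izhikevich(I=10.0)
```

In a hardware realization such as the one reported here, these same update equations are typically evaluated in fixed-point arithmetic with multiplier-free approximations, which is what makes the FPGA implementation cost-efficient.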