“…However, implementing DNN learning using backpropagation requires complex digital circuits that are unsuitable for edge intelligence hardware, which requires simple circuits with low power consumption [9,10,11,12,13,14,15]. Furthermore, backpropagation is a nonlocal learning algorithm that requires a significant amount of buffer memory to store all the neuronal and synaptic information from an entire network [16,17,18,19,20]. Alternatively, brain-mimicking learning algorithms, such as spike-timing-dependent plasticity (STDP), can be considered [21,22,23,24,25].…”
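To illustrate why STDP is attractive for such hardware, a minimal sketch of the standard pair-based STDP rule is shown below. Unlike backpropagation, the weight update depends only on the relative timing of one presynaptic and one postsynaptic spike, so it is local to the synapse and needs no network-wide buffer memory. The exponential-window form is the common textbook formulation; the function name and the time constants and amplitudes here are illustrative assumptions, not values from the cited works.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for a single pre/post spike pair.

    Times are in milliseconds. dt = t_post - t_pre:
      dt > 0 (post fires after pre)  -> potentiation (positive change)
      dt < 0 (post fires before pre) -> depression  (negative change)
    The constants are illustrative placeholders, not hardware-calibrated.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0
```

Because the rule consults only the two spike times at the synapse itself, it maps naturally onto the simple, low-power circuits the passage calls for, in contrast to backpropagation's global error transport.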