“…According to some theories, feedforward functional units perform a common computation: take the afferent input, sum the synaptically weighted signals linearly, apply a nonlinearity, and normalize by the aggregate activity of a nearby neural pool (DiCarlo et al., 2012; Riesenhuber and Poggio, 1999). Repeating this computation across multiple layers is also the essence of deep convolutional neural networks, which have recently proven useful in visual neuroscience (Kriegeskorte, 2015; Krizhevsky et al., 2012; Yamins et al., 2014). The architecture stacks repeated local operations such as nonlinear thresholding, normalization, max-pooling, and convolution until a final decision stage is reached.…”
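As a rough illustration of the canonical operation described above, the following NumPy sketch applies a weighted linear sum, a rectifying nonlinearity, and divisive normalization by the pooled activity of neighboring units. The specific nonlinearity (rectification), the normalization constant, and all numerical values are illustrative assumptions, not parameters taken from the cited work.

```python
import numpy as np

def canonical_unit(afferent, weights, pool_activity, sigma=1.0):
    """One feedforward functional unit (illustrative sketch):
    weighted linear sum -> pointwise nonlinearity -> divisive normalization."""
    linear = weights @ afferent                 # linear sum of synaptically weighted inputs
    rectified = np.maximum(linear, 0.0)         # nonlinearity (here: simple rectification)
    # normalize by the aggregate activity of a nearby neural pool
    return rectified / (sigma + pool_activity.sum())

# Toy example: 8 afferents driving 4 units, normalized by a pool of 4 neighbors.
rng = np.random.default_rng(0)
afferent = rng.random(8)
weights = rng.standard_normal((4, 8))
pool = rng.random(4)
print(canonical_unit(afferent, weights, pool))
```

Stacking this operation layer after layer, with convolution supplying the weighted sums and max-pooling reducing spatial resolution, is the repetitive structure the passage attributes to deep convolutional networks.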