“…However, these methods require an enormous amount of computation, which increases the power consumption and the processing delay at the receiver. On the other hand, artificial neural network (ANN)-based nonlinear equalizers are attracting attention because of their lower computational complexity [3,4,5,6,7]. In recent years, the rectified linear unit (ReLU) has often been employed as the activation function of ANN units instead of the conventional sigmoid function in deep neural networks (DNNs) used for, e.g., speech recognition and image recognition [8,9].…”
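To make the comparison between the two activation functions concrete, the following is a minimal sketch of a feedforward ANN equalizer forward pass in which either sigmoid or ReLU can be selected. The layer sizes, tap count, and random weights are placeholders for illustration only and are not taken from the cited works.

```python
# Illustrative sketch only: a one-hidden-layer ANN equalizer forward pass,
# comparing the conventional sigmoid activation with ReLU.
import numpy as np

def sigmoid(x):
    # Conventional sigmoid activation (saturating, involves an exponential).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: a simple max(0, x), cheap to evaluate.
    return np.maximum(0.0, x)

def equalize(samples, w1, b1, w2, b2, activation=relu):
    # Map a window of received samples through one hidden layer to an
    # equalized output; `activation` selects sigmoid or ReLU.
    hidden = activation(samples @ w1 + b1)
    return hidden @ w2 + b2

# Placeholder dimensions and weights for demonstration.
rng = np.random.default_rng(0)
window = rng.standard_normal((1, 8))            # 8-sample input window
w1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
w2, b2 = rng.standard_normal((16, 1)), np.zeros(1)

print(equalize(window, w1, b1, w2, b2, activation=relu))
print(equalize(window, w1, b1, w2, b2, activation=sigmoid))
```

In this sketch the only difference between the two configurations is the elementwise nonlinearity, which is what makes ReLU attractive when the per-symbol computational budget at the receiver is tight.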