We propose look-up table network-based reservoir computing (LUTNet-RC). To our knowledge, this work is the first application of LUTNets to RC. LUTNet-RC consists of a LUT-based reservoir layer and a non-LUT-based output layer. LUTNets have disadvantages: connectivity is limited to sparse patterns, and weights cannot be changed after implementation. However, when LUTNets are applied to the reservoir layer of RC (the LUT-based reservoir layer), these disadvantages disappear, because a reservoir layer works with sparse connectivity and fixed weights by design; only the advantage of small circuit resource usage remains. For the LUT-based reservoir layer, we propose and model a multi-bit weight reservoir, which modifies the conventional binarized reservoir to improve calculation accuracy. With LUTNets, the proposed multi-bit weight reservoir can be implemented without increasing the utilized circuit resources, because LUTNets encode only the input-output relationship of each neuron. Additionally, we propose a speed-up method for the output layer based on time-division calculation, which compares the current network state with the previous state and recomputes only the contributions of status-changed neurons. As a result, we implement a LUTNet-RC with 1500 reservoir neurons on a field-programmable gate array (KR260) running at 100 MHz. The utilized circuit resources are dominated by LUTs, which occupy approximately 26% of the available LUTs. The LUTNet-RC can perform more than 10^6 inferences per second. We also verify the performance of LUTNet-RC on the nonlinear auto-regressive moving average 10 (NARMA10) task, where it is comparable to that of conventional works. We conclude that LUTNet-RC is one of the highest-performance RC implementations on an FPGA.

Index Terms: neural networks, reservoir computing, field-programmable gate array

This work focuses on look-up table networks (LUTNets) [7], a class of binarized neural networks (BNNs) [8] that is more specialized for FPGA implementation than general BNNs. LUTNets are neural networks constructed from neurons oriented to the LUT elements on FPGAs, encoding only the input-output relationship of each neuron. LUTNets can operate with smaller circuit resources and higher recognition accuracy than general BNNs implemented on FPGAs [9], [10]. LUTNets have disadvantages: connectivity is limited to sparse patterns, and weights cannot be changed after implementation. However, when LUTNets are applied to the reservoir layer of RC, these disadvantages disappear, because a reservoir layer works with sparse connectivity and fixed weights by design; only the advantage of small circuit resource usage remains.

From this good complementarity between RC and LUTNets, we propose LUTNet-based RC (LUTNet-RC), which consists of a LUT-based reservoir layer and a non-LUT-based output layer. We implemented LUTNet-RC on an FPGA and evaluated its computational accuracy, speed, and circuit resource usage. The main contributions of this work are summarized as follows.

• We identified the good complementarity between RC and LUTNets and proposed LUTNet-RC, an application of LUTNets to RC.
• We proposed a novel...
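
To make the multi-bit weight reservoir concrete, the following is a minimal sketch of a reservoir update with binary neuron states and low-bit integer weights. The fan-in K, the 3-bit signed weight range, and the sign activation are illustrative assumptions; this excerpt does not specify the exact LUTNet-RC neuron model.

```python
import numpy as np

# Minimal sketch of a reservoir update with binary neuron states in {-1, +1}
# and multi-bit (here: 3-bit signed integer) weights. The fan-in K, the
# weight range, and the sign activation are illustrative assumptions, not
# the exact LUTNet-RC neuron model.
rng = np.random.default_rng(0)

N, K = 1500, 6                      # reservoir size; LUT-like fan-in per neuron
fanin = rng.integers(0, N, (N, K))  # sparse connectivity: K predecessors each
W = rng.integers(-4, 4, (N, K))     # 3-bit signed recurrent weights in [-4, 3]
w_in = rng.integers(-4, 4, N)       # 3-bit signed input weights

def step(x, u):
    """One update: x is the {-1, +1} state vector, u a scalar input sample."""
    pre = (W * x[fanin]).sum(axis=1) + w_in * u
    return np.where(pre >= 0, 1, -1).astype(np.int8)

x = np.ones(N, dtype=np.int8)       # initial state
x = step(x, 0.3)                    # one reservoir time step
```

Because each neuron here depends on only K predecessor states, its entire input-output relationship can be tabulated, which is what lets a LUTNet absorb multi-bit weights at no extra resource cost.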
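
The output-layer speed-up can be illustrated similarly. The sketch below (with hypothetical names such as delta_readout and W_out) maintains the previous readout and adds only the contributions of neurons whose binary state flipped, which is the status-changed-neuron idea described in the abstract.

```python
import numpy as np

# Sketch of the delta-style readout: keep the previous output y = W_out @ x
# and, at each step, update only the columns of W_out belonging to neurons
# whose state flipped. All names here are illustrative.
def delta_readout(W_out, y_prev, x_prev, x_curr):
    changed = np.nonzero(x_curr != x_prev)[0]          # status-changed neurons
    delta = (x_curr - x_prev)[changed].astype(float)   # +/-2 for {-1, +1} states
    return y_prev + W_out[:, changed] @ delta

rng = np.random.default_rng(0)
N = 1500
W_out = rng.normal(size=(1, N))
x0 = rng.choice(np.array([-1, 1], dtype=np.int8), N)
x1 = x0.copy()
x1[rng.choice(N, 30, replace=False)] *= -1             # 30 neurons flip state
y0 = W_out @ x0.astype(float)
y1 = delta_readout(W_out, y0, x0, x1)
assert np.allclose(y1, W_out @ x1.astype(float))       # matches full recompute
```

When only a small fraction of neurons change state per step, this reduces the readout from N multiply-accumulates to one per changed neuron, which is what enables time-division calculation of the output layer.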
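
For reference, the NARMA10 benchmark used in the evaluation is standardly defined by

$$ y(t+1) = 0.3\,y(t) + 0.05\,y(t)\sum_{i=0}^{9} y(t-i) + 1.5\,u(t-9)\,u(t) + 0.1, $$

where the input u(t) is drawn i.i.d. from the uniform distribution on [0, 0.5] and the task is to predict y(t+1) from the input history.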