In this work, we optimize hardware consumption in Rayleigh fading channel generation. Because of its advantage in saving memory resources, a recursive-structure-based channel generator is considered in this paper. We first investigate the long-term accuracy of channel generation with the recursive structure. We then derive the optimum bit width and channel update period that minimize the normalized mean-square error (MSE) under an upper-bound constraint on the bit rate, defined as the ratio of the bit width to the channel update period. In experiments on a Xilinx FPGA chip, the optimized generation of a Rayleigh channel with a U-shaped spectrum saves more than 31% of the bit rate at a normalized MSE of 1.8 × 10⁻⁶. In our experiments, this 31% bit-rate reduction saves up to 174 slices, 31 flip-flops, and 390 LUTs.

Index Terms-Fading channel generator, field-programmable gate array (FPGA), joint optimization over sampling period and bits, stability of an iterative structure based sinusoid generator, sum of sinusoids.
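To make the trade-off concrete, the following is a minimal sketch (not the paper's implementation) of a recursive, two-term sinusoid generator of the kind used in sum-of-sinusoids channel generators, with its state requantized to a chosen bit width at every update. The function names (`recursive_sinusoid`, `normalized_mse`) and the specific frequency and period values are illustrative assumptions; the sketch only demonstrates how the normalized MSE of the recursive structure depends on the bit width for a fixed update period.

```python
import numpy as np

def recursive_sinusoid(freq_hz, update_period_s, n_samples, bit_width):
    """Generate cos(2*pi*f*n*T) via the two-term recursion
    x[n] = 2*cos(w*T)*x[n-1] - x[n-2], quantizing the coefficient
    and every state value to `bit_width` fractional bits."""
    step = 2.0 ** (-bit_width)                    # fixed-point quantization step
    q = lambda v: np.round(v / step) * step       # round-to-nearest quantizer
    wT = 2.0 * np.pi * freq_hz * update_period_s  # phase increment per update
    c = q(2.0 * np.cos(wT))                       # quantized recursion coefficient
    x = np.empty(n_samples)
    x[0] = 1.0
    x[1] = q(np.cos(wT))
    for n in range(2, n_samples):
        x[n] = q(c * x[n - 1] - x[n - 2])         # recurse, then requantize
    return x

def normalized_mse(freq_hz, update_period_s, n_samples, bit_width):
    """Normalized MSE of the quantized recursion vs. an ideal cosine."""
    n = np.arange(n_samples)
    ideal = np.cos(2.0 * np.pi * freq_hz * update_period_s * n)
    gen = recursive_sinusoid(freq_hz, update_period_s, n_samples, bit_width)
    return np.mean((gen - ideal) ** 2) / np.mean(ideal ** 2)
```

Running `normalized_mse` for a fixed tone and update period at several bit widths shows the long-term error growing as the bit width shrinks, which is the accuracy side of the bit-rate constraint (bit width over update period) that the optimization balances.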