Reservoir computing (RC) is an efficient artificial-neural-network approach for model-free prediction and analysis of time series from dynamical systems. As a data-driven method, the capacity of RC is strongly affected by the time sampling interval of the training data. In this paper, taking the Lorenz system as an example, we explore the influence of this sampling interval on the performance of RC in predicting chaotic sequences. As the sampling interval increases, the prediction capacity of RC is first enhanced and then weakened, tracing a bell-shaped curve. By slightly revising the calculation method of the output matrix, the prediction performance of RC at small sampling intervals can be improved. Furthermore, RC can learn and reproduce the state of a chaotic system with a large time interval, almost five times larger than that of the classic fourth-order Runge–Kutta method. Our results demonstrate the capacity of RC in applications where the time sampling interval is constrained, and lay the foundation for building fast algorithms with larger iteration time steps.
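To make the setup concrete, the sketch below trains an echo state network, a standard RC architecture, on Lorenz data generated by the fourth-order Runge–Kutta method at a sampling interval `dt`, and fits the output matrix with ordinary ridge regression. This is a minimal illustration only: all hyperparameter values (reservoir size, spectral radius, regularization, `dt`) are assumptions, and the plain ridge readout shown here is the standard baseline, not the revised output-matrix calculation proposed in the paper.

```python
# Minimal echo-state-network sketch: Lorenz data sampled at interval dt,
# a fixed random reservoir, and an output matrix fit by ridge regression.
# All parameter values are illustrative assumptions, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)

def lorenz_rk4(x, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One fourth-order Runge-Kutta step of the Lorenz system."""
    def f(s):
        return np.array([sigma * (s[1] - s[0]),
                         s[0] * (rho - s[2]) - s[1],
                         s[0] * s[1] - beta * s[2]])
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# --- generate training/test data at an assumed sampling interval dt ---
dt, n_train, n_test = 0.02, 5000, 1000
x = np.array([1.0, 1.0, 1.0])
traj = np.empty((n_train + n_test + 1, 3))
for i in range(len(traj)):
    traj[i] = x
    x = lorenz_rk4(x, dt)

# --- fixed random reservoir (echo state network) ---
N, radius, leak, reg = 400, 0.9, 1.0, 1e-6   # illustrative hyperparameters
W_in = rng.uniform(-0.5, 0.5, (N, 3))
W = rng.normal(0.0, 1.0, (N, N))
W *= radius / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

def step(r, u):
    return (1 - leak) * r + leak * np.tanh(W @ r + W_in @ u)

# --- drive the reservoir, then fit the output matrix by ridge regression ---
r = np.zeros(N)
R = np.empty((n_train, N))
for t in range(n_train):
    r = step(r, traj[t])
    R[t] = r
washout = 100                                # discard the initial transient
Y = traj[1:n_train + 1]                      # one-step-ahead targets
A, B = R[washout:], Y[washout:]
W_out = np.linalg.solve(A.T @ A + reg * np.eye(N), A.T @ B).T

# --- closed-loop (free-running) prediction ---
u = traj[n_train]
pred = np.empty((n_test, 3))
for t in range(n_test):
    r = step(r, u)
    u = W_out @ r
    pred[t] = u

err = np.linalg.norm(pred - traj[n_train + 1:n_train + n_test + 1], axis=1)
print(f"mean short-term error over first 100 steps: {err[:100].mean():.3f}")
```

Rerunning this sketch with different values of `dt` is the kind of experiment the abstract refers to: short-term prediction error as a function of the sampling interval traces the bell-shaped dependence described above.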