Spectral algorithms offer a general and flexible framework for a broad range of machine learning problems and have recently attracted considerable attention. However, their theoretical properties remain largely unknown in the setting of infinite-dimensional functional data learning. To fill this gap, we study the performance of spectral algorithms for functional linear regression within the reproducing kernel Hilbert space framework. Despite their generality, we show that the proposed methods are easily implementable and attain minimax rates of convergence for prediction, determined by the regularity of the slope function, the eigenvalue decay rate of the integral operator induced jointly by the reproducing kernel and the covariance kernel, and the qualification of the filter function of the spectral algorithm. Moreover, our analysis pinpoints the advantage of spectral algorithms in overcoming the saturation effect of roughness regularization methods.
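
To fix ideas, the following is a minimal sketch of a generic spectral estimator in notation standard in the spectral regularization literature; the symbols below ($\hat T_n$, $\hat h_n$, $g_\lambda$, $\kappa$, $\gamma_\nu$, $\nu_g$) are illustrative conventions, not the paper's own definitions. Given the empirical integral operator $\hat T_n$ and a filter function $g_\lambda$ approximating the inverse map $t \mapsto 1/t$, the estimator of the slope function and the qualification condition take the form
\[
\hat\beta_\lambda = g_\lambda(\hat T_n)\,\hat h_n,
\qquad
\sup_{0 < t \le \kappa^2} t^{\nu}\,\bigl|1 - t\,g_\lambda(t)\bigr| \le \gamma_\nu \lambda^{\nu},
\quad 0 \le \nu \le \nu_g,
\]
where $\hat h_n$ denotes the empirical counterpart of the normal-equation right-hand side and $\nu_g$, the largest exponent for which the bound holds, is the qualification of the filter. Under this convention, Tikhonov (roughness) regularization has qualification $\nu_g = 1$, which is the source of its saturation effect, whereas filters such as Landweber iteration or spectral cut-off admit arbitrarily high qualification and can therefore adapt to smoother slope functions.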