Time delays are commonly used in models of biological systems and can significantly alter their dynamics. Many studies have focused on the effect of small delays on pattern formation in biological systems. In this paper, we investigate the effect of delays of arbitrary magnitude on the formation of Turing patterns in reaction-diffusion equations. First, for a delay system in general form, we propose a technique for calculating the critical value of the time delay above which a Turing instability occurs. We then apply the technique to a predator-prey model and study the pattern formation induced by the delay. For this model, we find that when the time delay is small the system exhibits either a uniform steady state or irregular patterns, neither of which is of Turing type; in the presence of a large delay, however, we find spiral patterns of Turing type. We also find that the critical delay is a decreasing function of the ratio of the carrying capacity to the half-saturation constant of the prey density.
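
As a rough illustration of the kind of computation such a technique involves (a minimal sketch, not the method developed in the paper), the following Python code finds the smallest destabilizing delay for a two-component linearization u_t = A u(t) + B u(t - tau) + D u_xx about a uniform steady state, assuming the delayed coupling B has a single nonzero entry B[1, 0] = B21 (e.g., a delayed predator response to the prey). The matrices A and D, the entry B21, the function name critical_delay, and the reduction of the characteristic equation to P(lambda) + Q exp(-lambda*tau) = 0 are all illustrative assumptions, not the paper's actual formulation.

    import numpy as np

    def critical_delay(A, B21, D, k_grid):
        """Smallest delay tau at which some Fourier mode exp(i*k*x) of the
        linearization u_t = A u(t) + B u(t - tau) + D u_xx first admits a
        purely imaginary characteristic root lam = i*omega.  With a single
        delayed entry B[1, 0] = B21, the characteristic equation of mode k is
            P(lam) + Q*exp(-lam*tau) = 0,  P(lam) = lam**2 + p*lam + q,
        and a crossing requires |P(i*omega)| = |Q|.  Returns (tau_star, k_star).
        """
        tau_star, k_star = np.inf, None
        for k in k_grid:
            M = A - k**2 * D                   # Jacobian of the k-th mode
            p = -np.trace(M)                   # P(lam) = lam**2 + p*lam + q
            q = np.linalg.det(M)
            Q = -M[0, 1] * B21                 # coefficient of exp(-lam*tau)
            if Q == 0.0:                       # no delayed coupling in this mode
                continue
            # |P(i*w)|**2 = Q**2  <=>  s**2 + (p**2 - 2q)*s + q**2 - Q**2 = 0, s = w**2
            for s in np.roots([1.0, p**2 - 2.0 * q, q**2 - Q**2]):
                if abs(np.imag(s)) < 1e-9 and np.real(s) > 0.0:
                    w = np.sqrt(np.real(s))
                    z = -(-w**2 + 1j * p * w + q) / Q      # exp(-i*w*tau) = -P(iw)/Q
                    tau = (-np.angle(z)) % (2.0 * np.pi) / w  # first positive crossing
                    if tau < tau_star:
                        tau_star, k_star = tau, k
        return tau_star, k_star

    # Hypothetical parameter values, chosen only to exercise the routine
    # (the steady state is stable at tau = 0 for all wavenumbers):
    A = np.array([[-0.1, -1.0],
                  [ 0.0, -0.5]])     # instantaneous Jacobian at the steady state
    D = np.diag([0.01, 1.0])         # diffusion coefficients
    tau_c, k_c = critical_delay(A, B21=0.9, D=D, k_grid=np.linspace(0.0, 5.0, 501))

A Turing-type (spatially nonuniform) instability corresponds to the minimizing wavenumber k_star being nonzero; whether the crossing is in fact destabilizing, i.e., whether the root moves into the right half-plane as tau increases through tau_star, must still be verified for the particular model.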