Time delays, modelling the process of intracellular gene expression, have been shown to have important impacts on the dynamics of pattern formation in reaction-diffusion systems. In particular, past work has shown that such time delays can shrink the Turing space, thereby inhibiting patterns from forming across large ranges of parameters. Such delays can also increase the time taken for pattern formation even when Turing instabilities occur. Here we consider reaction-diffusion models incorporating fixed or distributed time delays, modelling the underlying stochastic nature of gene expression dynamics, and analyse these through a systematic linear instability analysis and numerical simulations for several different sets of reaction kinetics. We find that even complicated distribution kernels (skewed Gaussian probability density functions) have little impact on the reaction-diffusion dynamics compared to fixed delays with the same mean delay. We show that the location of the delay terms in the model can either increase or decrease the size of the Turing space as the mean time delay, τ, is increased. We further show that the time to pattern formation from a perturbation of the homogeneous steady state scales linearly with τ, and conjecture that this is a general impact of time delay on reaction-diffusion dynamics, independent of the form of the kinetics or the location of the delayed terms. Finally, we show that while initial and boundary conditions can influence these dynamics, particularly the time to pattern, the effects of delay appear robust under variations of initial and boundary data. Overall, our results help clarify the role of gene expression time delays in reaction-diffusion patterning, and suggest clear directions for further work in studying more realistic models of pattern formation.
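To make the class of models concrete, the sketch below simulates one fixed-delay example of the kind described above: a 1D reaction-diffusion system with a discrete delay τ and a crude spatial-variance threshold as a proxy for the time to pattern. Everything in it is an illustrative assumption rather than the paper's actual setup: the Schnakenberg-type kinetics, the placement of the delay on the nonlinear term, the parameter values, the zero-flux boundaries, and the explicit finite-difference discretisation.

```python
import numpy as np
from collections import deque

# Minimal illustrative sketch (not the authors' code): explicit finite differences for
#   u_t = Du u_xx + a - u + u(t - tau)^2 v(t - tau)
#   v_t = Dv v_xx + b     - u(t - tau)^2 v(t - tau)
# i.e. Schnakenberg-type kinetics with the delay placed on the nonlinear term
# (a hypothetical choice made purely for illustration).

# Hypothetical parameter values (not taken from the paper)
a, b = 0.2, 1.3              # kinetic parameters
Du, Dv = 1.0, 40.0           # diffusion coefficients
tau = 1.0                    # fixed time delay
L, N, T = 50.0, 101, 200.0   # domain length, grid points, final time
dx = L / (N - 1)
dt = 0.2 * dx**2 / Dv        # time step well below the explicit stability limit
n_delay = max(1, round(tau / dt))

def laplacian(w):
    """Second difference with zero-flux (Neumann) boundaries via mirrored ghost points."""
    wg = np.concatenate(([w[1]], w, [w[-2]]))
    return (wg[:-2] - 2.0 * wg[1:-1] + wg[2:]) / dx**2

rng = np.random.default_rng(0)
u_star, v_star = a + b, b / (a + b) ** 2       # homogeneous steady state
u = u_star + 1e-3 * rng.standard_normal(N)     # small random perturbation
v = v_star + 1e-3 * rng.standard_normal(N)

# History buffer holding the last n_delay states; constant (perturbed) history on [-tau, 0)
history = deque([(u.copy(), v.copy())] * n_delay, maxlen=n_delay)

t, t_pattern = 0.0, None
while t < T:
    ud, vd = history[0]                        # fields evaluated at t - tau
    reaction = ud**2 * vd                      # delayed autocatalytic term
    u_new = u + dt * (Du * laplacian(u) + a - u + reaction)
    v_new = v + dt * (Dv * laplacian(v) + b - reaction)
    history.append((u.copy(), v.copy()))       # push current state; oldest drops off
    u, v, t = u_new, v_new, t + dt
    if t_pattern is None and u.std() > 0.1 * u_star:
        t_pattern = t                          # crude proxy for the time to pattern
print("approximate time to pattern:", t_pattern)
```

Under these assumptions, sweeping τ over a range of values and recording the resulting time-to-pattern estimates is one simple way to probe numerically the linear scaling with τ conjectured above.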