This paper discusses model order reduction of discrete-time linear time-delayed systems over a limited frequency interval. First, a finite-frequency index is introduced to characterize the desired approximation performance over the pre-specified frequency interval. By exploiting finite-frequency analysis results for linear delay systems, sufficient criteria guaranteeing stability of the reduced-order model and optimizing the finite-frequency approximation error are derived with the aid of matrix inequality techniques. The finite-frequency model order reduction problem is then converted into an LMI-based optimization problem that can be solved efficiently. Finally, a numerical example is given to illustrate the effectiveness of the proposed results.
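To give a flavor of the LMI-based formulation mentioned above, the following is a minimal sketch in Python using CVXPY (a library choice not made in the paper). It checks feasibility of the generic discrete-time Lyapunov LMI $A^\top P A - P \prec 0$, $P \succ 0$, for an illustrative system matrix; the paper's actual finite-frequency and delay-dependent conditions are more involved, so this is only an assumed, simplified stand-in for the kind of semidefinite program such conditions yield.

```python
import cvxpy as cp
import numpy as np

# Hypothetical system matrix of a stable discrete-time system; the paper's
# reduced-order model and finite-frequency conditions are more elaborate.
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])
n = A.shape[0]

# Decision variable: symmetric Lyapunov matrix P.
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # small margin to enforce strict inequalities numerically

constraints = [
    P >> eps * np.eye(n),                    # P > 0
    A.T @ P @ A - P << -eps * np.eye(n),     # A^T P A - P < 0
]

# Pure feasibility problem: a constant objective suffices.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
```

In the paper's setting, the objective would instead minimize a bound on the finite-frequency approximation error subject to analogous matrix inequality constraints.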