In this study, we present a thorough assessment of the effect of jitter in a high-speed 10 Gbps, 200 GHz channel-spaced Wavelength-Division Multiplexing (WDM) optical network. We first develop a simulation model to study the effect of jitter in the proposed network and to determine the maximum amount of jitter the network can withstand. The model is then employed to predict the types of jitter received at the end of the transmission line. For an input power level of 0 dBm and a Bit Error Rate (BER) of 1 × 10⁻⁹, the observed total jitter (J_T), random jitter (J_R), and deterministic jitter (J_D) are 0.2676 UI, 0.1602 UI, and 0.1073 UI, respectively.
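These figures are consistent with the standard dual-Dirac jitter decomposition, under which total jitter at a given BER is the sum of the deterministic component and the peak-to-peak random component. The relations below are a reading of the reported numbers under that model, assuming J_R is quoted as a peak-to-peak value at the stated BER (the text does not say so explicitly):

\[
J_T(\mathrm{BER}) = J_D + J_R^{\mathrm{pp}}(\mathrm{BER}),
\qquad
J_R^{\mathrm{pp}}(\mathrm{BER}) = 2\,Q(\mathrm{BER})\,\sigma_{RJ}
\]
\[
0.1073\ \mathrm{UI} + 0.1602\ \mathrm{UI} = 0.2675\ \mathrm{UI} \approx 0.2676\ \mathrm{UI},
\qquad
2\,Q(10^{-9}) \approx 11.996 \;\Rightarrow\; \sigma_{RJ} \approx 0.0134\ \mathrm{UI}
\]

Here \(\sigma_{RJ}\) is the RMS width of the Gaussian random-jitter tails and \(Q(\mathrm{BER})\) the usual Gaussian quantile factor; the derived \(\sigma_{RJ}\) is an inference, not a value reported in the study.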