A theoretical analysis is presented to evaluate the impact of fiber chromatic dispersion on the bit error rate (BER) performance of a two-dimensional wavelength-hopping time-spreading optical CDMA system. A bipolar-unipolar coding scheme is considered for a 7-chip m-sequence and for 31- and 127-chip Gold sequences over a standard single-mode fiber. The numerical results show that dispersion degrades the BER by several orders of magnitude. For 100 simultaneous users at a received power of -4 dBm, the average BER is about 10⁻⁶ with chromatic dispersion and 10⁻¹² without it, for the 7-chip m-sequence at a chip rate of 10 Gchip/s over a fiber length of 245.05 km with a dispersion coefficient of 16 ps/(nm·km). For the 31-chip Gold sequence, the corresponding average BER is of the order of 10⁻¹⁷ with dispersion and 10⁻²¹ without it. It is further observed that for 10 simultaneous users, the system operating at a chip rate of 10 Gchip/s suffers a power penalty of 9.15 dB for the 7-chip m-sequence and 1 dB for the 31-chip Gold sequence at a BER of 10⁻⁹.
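The scale of the dispersion effect can be sanity-checked with a back-of-the-envelope estimate of pulse broadening, Δτ = D·L·Δλ, using the abstract's stated parameters (10 Gchip/s, D = 16 ps/(nm·km), L = 245.05 km). The 1550 nm carrier wavelength and the assumption of transform-limited chip pulses (spectral width ≈ chip rate) are illustrative assumptions, not values given in the paper:

```python
# Estimate of chromatic-dispersion pulse broadening for the quoted
# system parameters. Assumptions (not from the paper): carrier at
# 1550 nm, transform-limited chip pulses with spectral width ~ chip rate.

C = 2.998e8           # speed of light, m/s
WAVELENGTH = 1550e-9  # assumed carrier wavelength, m

def dispersion_broadening_ps(chip_rate_hz: float,
                             dispersion_ps_nm_km: float,
                             length_km: float) -> float:
    """Return the broadening D*L*dlambda (ps) for a transform-limited chip."""
    delta_nu = chip_rate_hz                               # spectral width, Hz
    # Convert frequency width to wavelength width: dlambda = lambda^2 * dnu / c
    delta_lambda_nm = (WAVELENGTH ** 2) * delta_nu / C * 1e9
    return dispersion_ps_nm_km * length_km * delta_lambda_nm

chip_rate = 10e9                  # 10 Gchip/s
t_chip_ps = 1e12 / chip_rate      # 100 ps chip duration
broadening = dispersion_broadening_ps(chip_rate, 16.0, 245.05)
print(f"chip duration:          {t_chip_ps:.0f} ps")
print(f"dispersion broadening:  {broadening:.0f} ps "
      f"(~{broadening / t_chip_ps:.1f} chip periods)")
```

Under these assumptions the broadening spans several chip periods, which is consistent with the orders-of-magnitude BER degradation reported above; shorter codes (7-chip m-sequence) are hit harder than longer ones, matching the quoted power-penalty gap.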