The effects of time delay on the fluctuation properties of a bistable system are investigated by simulating its normalised correlation function C(s). Three cases are considered: linear delay, cubic delay, and global delay. The simulation results indicate that the linear delay enhances the fluctuations of the system (reducing its stability), whereas the cubic delay and the global delay weaken them (enhancing its stability), and that the effect of the cubic delay is more pronounced than that of the linear and global delays.
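The sketch below illustrates, under stated assumptions, the kind of simulation described: a minimal Euler-Maruyama integration of an overdamped bistable Langevin equation of the assumed form dx/dt = a x - b x^3 + noise, in which the linear term, the cubic term, or both are evaluated at the delayed time t - tau (the three delay cases), followed by an estimate of the normalised correlation function C(s). The parameter values, initial history, and function names are illustrative assumptions, not the paper's actual settings or code.

```python
import numpy as np

# Minimal sketch (assumed model): dx/dt = a*x_lin - b*x_cub**3 + sqrt(2*D)*xi(t),
# where x_lin and/or x_cub are taken at the delayed time t - tau depending on the
# case ("linear", "cubic", or "global"). All parameter values are illustrative.

def simulate(case="linear", a=1.0, b=1.0, D=0.1, tau=1.0,
             dt=0.01, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    lag = int(round(tau / dt))          # delay expressed in integration steps
    x = np.empty(n_steps)
    x[:lag + 1] = 1.0                   # constant prescribed history on [-tau, 0]
    noise_amp = np.sqrt(2.0 * D * dt)
    for i in range(lag, n_steps - 1):
        x_lin = x[i - lag] if case in ("linear", "global") else x[i]
        x_cub = x[i - lag] if case in ("cubic", "global") else x[i]
        drift = a * x_lin - b * x_cub**3
        x[i + 1] = x[i] + drift * dt + noise_amp * rng.standard_normal()
    return x[lag:]                      # drop the prescribed history segment

def normalised_correlation(x, max_lag_steps=2000):
    """Estimate C(s) = <dx(t) dx(t+s)> / <dx(t)^2> from one long trajectory."""
    dx = x - x.mean()
    var = np.mean(dx**2)
    return np.array([np.mean(dx[:len(dx) - s] * dx[s:]) if s else 1.0
                     for s in range(max_lag_steps)]) / (var if var else 1.0)

if __name__ == "__main__":
    dt = 0.01
    for case in ("linear", "cubic", "global"):
        C = normalised_correlation(simulate(case=case, dt=dt))
        # Compare how quickly C(s) decays across the three delay cases.
        print(case, "C(s) at s = 5:", round(C[int(5 / dt)], 3))
```

In this kind of comparison, the decay of C(s) with the lag s is the quantity used to contrast the three delay cases; how that decay relates to enhanced or weakened fluctuations follows the paper's analysis rather than anything fixed by the sketch itself.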