The influence of laser chirp and parasitics on optimized dispersion maps (ODMs) for multi-span directly modulated/direct-detection (DD) systems operating at 10 Gbit·s⁻¹ is assessed through numerical simulation. It is shown that the ODM uses under-compensation in the last span and that, for high laser chirp levels, in-line dispersion compensation requires over-compensation. However, if laser chirp and parasitics are reduced, only under-compensation is required, and power levels similar to those reached by ODM intensity-modulated/DD systems can be launched into the transmission fiber for the same signal mask degradation.
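As a minimal illustration of the under-/over-compensation terminology used above, the sketch below accumulates the residual dispersion of a multi-span link for a chosen per-span compensation ratio. All parameter values (fiber dispersion, span length, number of spans, the 0.8 ratio in the last span) and the function itself are assumptions for illustration only and are not taken from the paper.

```python
# Illustrative sketch (assumed parameters, not from the paper): residual
# accumulated dispersion along a multi-span link for a given dispersion map.

D_FIBER = 17.0      # ps/(nm*km), typical SSMF dispersion (assumed)
SPAN_LENGTH = 80.0  # km per span (assumed)
N_SPANS = 5         # number of spans (assumed)

def accumulated_dispersion(comp_ratios):
    """Return the accumulated dispersion (ps/nm) after each span.

    comp_ratios[i] is the fraction of span i's dispersion compensated
    in-line after that span: < 1 means under-compensation,
    > 1 means over-compensation, 1 means full compensation.
    """
    acc = 0.0
    history = []
    for r in comp_ratios:
        span_disp = D_FIBER * SPAN_LENGTH
        acc += span_disp - r * span_disp  # residual left after in-line compensation
        history.append(acc)
    return history

# Hypothetical map: full compensation in all spans except the last,
# which is under-compensated (the qualitative behaviour reported for the ODM).
ratios = [1.0] * (N_SPANS - 1) + [0.8]
print(accumulated_dispersion(ratios))
```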