Event cameras are novel neuromorphic sensors that asynchronously capture pixel-level intensity changes in the form of "events". Event simulation from existing RGB datasets is commonly used to overcome the need for large amounts of annotated data, which are scarce due to the novelty of event sensors. In this context, the possibility of using event simulation in synthetic scenarios, where data generation is not limited to pre-existing datasets, remains unexplored to date. In this work, we analyze the synth-to-real domain shift in event data, i.e., the gap between simulated events obtained from synthetic renderings and events captured with a real camera on real images. To this end, we extend the popular RGB-D Object Dataset (ROD), which already comes with a synthetic version (SynROD), to the event modality. The resulting Neuromorphic ROD dataset (N-ROD) is the first to enable a synth-to-real analysis of event data, and our experiments on it show the effectiveness of Domain Adaptation techniques in reducing the synth-to-real shift. Moreover, through extensive experiments on multi-modal RGB-E data, we show that events can be effectively combined with conventional visual information, encouraging further research in this area. The N-ROD dataset is available at https://N-ROD-dataset.github.io/home/.
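Since the abstract leans on the notion of event simulation without spelling it out, the sketch below illustrates the standard log-intensity threshold model that underlies most frame-based event simulators; it is not the paper's pipeline, and the function name `simulate_events`, the default contrast threshold `C=0.2`, and the frame-based interface are illustrative assumptions.

```python
# Minimal illustrative sketch (assumptions noted above), NOT the paper's simulator:
# an event (x, y, t, p) fires whenever the log-intensity at a pixel changes by
# more than a contrast threshold C since the last event at that pixel.
import numpy as np

def simulate_events(frames, timestamps, C=0.2, eps=1e-6):
    """Convert a sequence of intensity frames into a list of events.

    frames: (N, H, W) float array of intensity images in [0, 1]
    timestamps: (N,) array of frame times in seconds
    C: contrast threshold (hypothetical default)
    Returns a list of (x, y, t, polarity) tuples.
    """
    log_ref = np.log(frames[0] + eps)  # per-pixel log-intensity at the last event
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        diff = np.log(frame + eps) - log_ref
        # Positive events where brightness rose by at least C,
        # negative events where it fell by at least C.
        for polarity, mask in ((+1, diff >= C), (-1, diff <= -C)):
            ys, xs = np.nonzero(mask)
            events.extend((x, y, t, polarity) for x, y in zip(xs, ys))
            # Advance the reference by whole multiples of C at firing pixels.
            log_ref[mask] += polarity * C * np.floor(np.abs(diff[mask]) / C)
    return events
```

In this simplified model all events between two frames share the later frame's timestamp; real simulators additionally interpolate intensity over time to assign sub-frame timestamps.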