In recent years, the evaluation of water quality in distribution systems has attracted enormous interest in the scientific community, driven by the increasing concentration of population in urban areas and by frequent issues with supply water quality. Following the wave of bioterrorism concerns subsequent to the events of September 11th, 2001, a need can be foreseen for adequate preventive measures against contamination in water distribution systems, whether arising from accidental contamination or from the deliberate injection of toxic agents of any origin into the network. It is therefore very important to create a sensor system that detects contamination events in real time while maintaining the reliability and efficiency of the measurements and limiting the cost of the instrumentation. A reliable monitoring system for this kind of problem cannot be deployed without realistic modelling support.

The current state of the art in water distribution system analysis usually adopts a simplified approach to water quality modelling, neglecting dispersion and diffusion and considering simplified reaction kinetics. Even if such simplifications are commonly acceptable in fully turbulent flows, they may lead to significant errors in transitional flows with low velocity, and thus to an unreliable interpretation of contamination in complex networks.

The present paper aims to compare different modelling approaches to the evaluation of contaminant dispersion in two distribution networks: a laboratory network in which contamination experiments were carried out in a controlled environment (Enna, Italy) and a full-scale real distribution network (Zandvoort, Netherlands).
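To make the simplification at stake concrete, the one-dimensional transport of a contaminant along a pipe is classically described by an advection-dispersion-reaction equation; the notation below (concentration \(C\), mean flow velocity \(u\), longitudinal dispersion coefficient \(D\), first-order decay rate \(k\)) is standard textbook usage and is not taken from the present paper:

```latex
% Full 1-D advection-dispersion-reaction transport along the pipe axis x:
\frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x}
  = D\,\frac{\partial^{2} C}{\partial x^{2}} - k\,C

% Simplified advection-reaction model typically adopted by network
% water quality solvers, obtained by setting D = 0:
\frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x} = -\,k\,C
```

In fully turbulent flow the advective term dominates and dropping \(D\,\partial^{2}C/\partial x^{2}\) is usually harmless, but at low velocities the effective longitudinal dispersion can become comparable to, or larger than, the advective transport, which is why the simplified model can misrepresent contaminant fronts in the low-flow branches of a network.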