Several remediation technologies are currently used to address groundwater pollution. “Pump and treat” (P&T) is one of the most widely applied: contaminated groundwater is extracted from the subsurface by pumping and then treated before being discharged or reinjected into the aquifer. Although P&T is a very adaptable technology, remediation is often achieved only over long and unsustainable time frames because of limitations imposed by the hydrogeological setting and by contaminant properties. The cost–benefit balance therefore deteriorates over time, making a preliminary evaluation of the clean-up time crucial. The aim of this paper is to compare, in an integrated manner, the application of several models for estimating the time to compliance of a P&T system in relation to the specific hydrogeological conditions. Analytical solutions are analyzed and applied to an industrial site and to a synthetic case. In both cases, the batch flushing and advection-dispersion-retardation (ADR) models underestimate remediation times when their results are compared with real or simulated monitoring data, whereas the Square Root model provides more reliable estimates. Finally, for the synthetic case, the reliability of the analytical approaches and the effects of matrix diffusion are tested against a purpose-built numerical groundwater transport model, which confirms the results of the analytical methods and the strong influence of matrix diffusion.
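To illustrate the kind of screening calculation the batch flushing approach enables, the following is a minimal sketch of its standard formulation, in which the contaminated zone is treated as a fully mixed reservoir so that concentration decays exponentially with the number of pore volumes flushed. The function name, variable names, and the example parameter values are illustrative assumptions, not taken from the paper.

```python
import math

def batch_flushing_cleanup_time(c0, c_target, pore_volume, q, retardation=1.0):
    """Estimate P&T clean-up time with the batch flushing (ideal mixing) model.

    Assumes the plume behaves as a fully mixed reservoir, so the dissolved
    concentration decays exponentially as clean water flushes through:
        C(t) = C0 * exp(-Q * t / (R * V_pore))
    Solving C(t) = c_target for t gives the time to compliance.

    c0          -- initial dissolved concentration (e.g. mg/L)
    c_target    -- regulatory target concentration (same units as c0)
    pore_volume -- contaminated pore volume V_pore (m^3)
    q           -- total extraction rate Q (m^3/day)
    retardation -- retardation factor R (dimensionless, >= 1)
    """
    if c_target >= c0:
        return 0.0  # already in compliance
    return retardation * (pore_volume / q) * math.log(c0 / c_target)

# Hypothetical example: 100-fold concentration reduction, 50,000 m^3
# contaminated pore volume, 100 m^3/day extraction rate, R = 2
t_days = batch_flushing_cleanup_time(1.0, 0.01, 50_000, 100, retardation=2.0)
print(f"estimated clean-up time: {t_days:.0f} days")
```

Because this model ignores dispersion, back-diffusion from low-permeability zones, and non-equilibrium desorption, estimates of this kind tend to be optimistic, which is consistent with the underestimation reported in the abstract.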