Observations show that stellar streams originating in satellite dwarf galaxies are common in the Universe. While such events are predicted by theory, it is not clear how many of the streams that are generated are subsequently washed out to the point where it becomes impossible to detect them. Here we study how these diffusion times are affected by the fact that the typical gravitational potentials of host galaxies can sustain chaotic orbits. We do this by comparing the behaviour of simulated stellar streams residing in chaotic and non-chaotic regions of phase space. We find that chaos does reduce the time interval over which streams can be detected. By analysing detectability criteria in configuration and velocity space, we find that the impact of these results on observations depends on the quality of both the data and the underlying stellar halo model. For all the stellar streams studied, we obtain a similar upper limit on the detectable mass.