US regulatory ultraviolet (UV) disinfection credit is typically granted when turbidity is ≤1 NTU. However, studies show that turbidity does not always correlate well with UV dose response. This study examined the impact of worst‐case high‐turbidity scenarios at drinking water treatment plants on UV254 inactivation of indigenous spores from unfiltered source water and from unsettled flocculation and softening steps. Flocculated water (turbidity = 6.49–164 NTU) had the lowest dose response, with a significantly lower Geeraerd‐tail maximum inactivation rate (kmax = 0.021 cm²/mJ) and a higher residual population density (Nres = 7.081 SFU/mL). Raw source water (kmax = 0.027 cm²/mJ, Nres = 1.168 SFU/mL, turbidity = 0.978–215 NTU) and softened water (kmax = 0.030 cm²/mJ, Nres = 0.216 SFU/mL, turbidity = 318–495 NTU) had similar dose responses despite significantly different water quality. Particle size and the degree of particle association of spores best explained the differences in dose response. Almost all spores were associated with flocculated particles rather than free‐floating, which increased tailing and reduced UV inactivation. Based on regulatory reduction equivalent dose bias factors and the UV sensitivities of spiked Bacillus subtilis spores, Cryptosporidium would be 4‐log inactivated in these raw, flocculated, or softened waters if UV transmittance were ≥65%, 90%, or 80%, respectively, even though turbidity was well above 1 NTU. Depending on particle characteristics, partial inactivation credit when turbidity is >1 NTU should be considered to avoid high‐tier violations while still protecting public health.
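For reference, kmax and Nres above are parameters of the Geeraerd‐tail inactivation model; a minimal sketch, assuming the standard log‐linear‐with‐tail form (the exact parameterization used in the study is not restated here), is

\[
% Geeraerd log-linear model with tail:
% N_0 = initial spore density (SFU/mL), N_res = residual (tailing) density,
% k_max = maximum inactivation rate (cm^2/mJ), D = UV254 fluence (mJ/cm^2)
N(D) = \left(N_{0} - N_{\mathrm{res}}\right) e^{-k_{\mathrm{max}} D} + N_{\mathrm{res}},
\]

so kmax sets the slope of the initial log‐linear decline and Nres sets the plateau that produces tailing at high UV doses.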