Corrosion in water-distribution systems is a costly problem, and controlling corrosion is a primary focus of efforts to reduce lead (Pb) and copper (Cu) in tap water. High chloride concentrations can increase the tendency of water to cause corrosion in distribution systems. The effects of chloride are also expressed in several indices commonly used to describe the potential corrosivity of water, including the chloride-to-sulfate mass ratio (CSMR) and the Larson Ratio (LR). Elevated CSMR has been linked to galvanic corrosion of Pb, whereas LR is indicative of the corrosivity of water to iron and steel. Despite the known importance of chloride, CSMR, and LR to the potential corrosivity of water, monitoring of seasonal and interannual changes in these parameters is not common among water purveyors. We analyzed long-term trends (1992-2012) and the current status (2010-2015) of chloride, CSMR, and LR to investigate short- and long-term temporal variability in the potential corrosivity of US streams and rivers. Among all sites in the trend analyses, chloride, CSMR, and LR increased slightly, with median changes of 0.9 mg/L, 0.08, and 0.01, respectively. Urban-dominated sites, however, had much larger increases: 46.9 mg/L, 2.50, and 0.53, respectively. Median CSMR and LR in urban streams (4.01 and 1.34, respectively) greatly exceeded the thresholds found to cause corrosion in water distribution systems (0.5 and 0.3, respectively). Urbanization was strongly correlated with elevated chloride, CSMR, and LR, especially in the most snow-affected areas in the study, which are the most likely to use road salt. The probability of Pb action-level exceedances (ALEs) in drinking water facilities increased with the CSMR of raw surface water, indicating a statistical connection between surface water chemistry and corrosion in drinking water facilities. Optimal corrosion control will require monitoring of the critical constituents that reflect the potential corrosivity of source surface waters.
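
The sketch below illustrates how the two indices might be computed from routine water-quality measurements. It assumes the common definitions: CSMR as the mass ratio of chloride to sulfate, and LR as the ratio of chloride plus sulfate to bicarbonate alkalinity expressed in milliequivalents per liter. The function names, input values, and the form of the LR calculation are illustrative assumptions, not taken from the study itself; the 0.5 and 0.3 comparison values are the corrosion thresholds cited above.

# Minimal sketch (Python), assuming the standard CSMR and Larson Ratio definitions.
# Equivalent weights (g per equivalent) used to convert mg/L to meq/L.
EQ_WEIGHT = {"Cl": 35.45, "SO4": 48.03, "HCO3": 61.02}

def csmr(chloride_mg_l, sulfate_mg_l):
    """Chloride-to-sulfate mass ratio (dimensionless)."""
    return chloride_mg_l / sulfate_mg_l

def larson_ratio(chloride_mg_l, sulfate_mg_l, bicarbonate_mg_l):
    """Larson Ratio: (Cl + SO4) / HCO3, all in milliequivalents per liter."""
    cl_meq = chloride_mg_l / EQ_WEIGHT["Cl"]
    so4_meq = sulfate_mg_l / EQ_WEIGHT["SO4"]
    hco3_meq = bicarbonate_mg_l / EQ_WEIGHT["HCO3"]
    return (cl_meq + so4_meq) / hco3_meq

if __name__ == "__main__":
    # Hypothetical sample: 60 mg/L chloride, 20 mg/L sulfate, 120 mg/L bicarbonate.
    cl, so4, hco3 = 60.0, 20.0, 120.0
    print(f"CSMR = {csmr(cl, so4):.2f} (corrosion threshold ~0.5)")
    print(f"LR   = {larson_ratio(cl, so4, hco3):.2f} (corrosion threshold ~0.3)")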