Trend, detection, and attribution analyses were performed using naturalized streamflow observations and routed land surface model runoff for 10 subbasins in the Columbia River Basin during water years 1951–2008. The Energy Exascale Earth System Model (E3SM) Land Model (ELM) version 1.0 and the Routing Application for Parallel computatIon of Discharge (RAPID) were used to conduct semi-factorial simulations driven by multiple sets of bias-corrected forcing data. Four potential drivers were analyzed: climate change (CLMT), CO2 concentration (CO2), nitrogen deposition (NDEP), and land use and land cover change (LULCC). All subbasins except the Middle Snake, Upper Snake, and Upper Columbia Subbasins showed significant (α = 0.10) declines in observed annual total streamflow. These declines were driven by significant decreases in June–October streamflow, which in turn produced significant decreases in peak and summer flows. Except in the Snake River Subbasins, LULCC produced the same pattern of monthly streamflow declines, but shifted to May–September. NDEP also yielded significant June–October trends; however, these were increases in streamflow rather than decreases. Although the trends in CO2, NDEP, and LULCC were significant, their signals of change were weak compared with the CLMT signal and the natural internal variability of streamflow. Overall, the detection and attribution analysis showed that the historical changes in annual total streamflow, streamflow center of timing, and summer mean streamflow could be attributed to changing climate and variability.
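
The abstract does not name the trend test behind the α = 0.10 significance threshold; the nonparametric Mann-Kendall test is a standard choice for streamflow trend detection, and the Python sketch below shows how such a test might be applied to an annual series. The function, the synthetic data, and the no-tie-correction variance formula are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x, alpha=0.10):
    """Two-sided Mann-Kendall trend test (no tie correction).

    Returns the S statistic, the normal-approximation Z score,
    the p-value, and whether the trend is significant at `alpha`.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    # Variance of S under the null hypothesis of no trend (assumes no ties).
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected Z score.
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * norm.sf(abs(z))  # two-sided p-value
    return s, z, p, p < alpha

# Hypothetical example: a declining annual series for water years 1951-2008.
rng = np.random.default_rng(0)
flow = 100.0 - 0.3 * np.arange(58) + rng.normal(0, 5, 58)
s, z, p, sig = mann_kendall(flow, alpha=0.10)
print(f"S={s:.0f}, Z={z:.2f}, p={p:.3f}, significant={sig}")
```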
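
Center of timing (CT) is conventionally defined as the flow-weighted mean day of the water year, CT = Σ t_i q_i / Σ q_i, where q_i is the flow on day t_i. A minimal sketch under that conventional definition follows; the function name and the synthetic hydrograph are illustrative, not taken from the paper.

```python
import numpy as np

def center_of_timing(daily_flow):
    """Flow-weighted mean day of the water year.

    `daily_flow` is a 1-D array of daily streamflow ordered from
    October 1 (day 1) through September 30 of the water year.
    """
    q = np.asarray(daily_flow, dtype=float)
    t = np.arange(1, len(q) + 1)      # day of water year
    return np.sum(t * q) / np.sum(q)  # CT in days since October 1

# Hypothetical example: a snowmelt-dominated hydrograph peaking in mid-May.
days = np.arange(1, 366)
flow = np.exp(-0.5 * ((days - 230) / 30.0) ** 2) + 0.1
print(f"Center of timing: day {center_of_timing(flow):.1f} of the water year")
```

An earlier CT in later decades is a common signature of warming-driven shifts from snowmelt toward earlier runoff, which is why CT appears alongside annual totals and summer means in the attribution results.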
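
In a semi-factorial design, each driver's signal is typically isolated by differencing a single-forcing run against a control in which all drivers are held at baseline. The sketch below illustrates that differencing and a rough comparison of each signal against the control's interannual variability; the run names, synthetic magnitudes, and differencing convention are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
years = 58  # water years 1951-2008

# Hypothetical annual streamflow (mm) from routed simulations. Each
# single-forcing run varies one driver; CTRL holds all drivers at baseline.
ctrl = 500 + rng.normal(0, 20, years)
runs = {
    "CLMT":  ctrl - 0.80 * np.arange(years) + rng.normal(0, 5, years),
    "CO2":   ctrl + 0.05 * np.arange(years) + rng.normal(0, 5, years),
    "NDEP":  ctrl + 0.10 * np.arange(years) + rng.normal(0, 5, years),
    "LULCC": ctrl - 0.10 * np.arange(years) + rng.normal(0, 5, years),
}

# Each driver's signal is its run's departure from the control; comparing
# the signal's size to CTRL's interannual spread gives a rough sense of
# signal strength versus internal variability.
for name, q in runs.items():
    signal = q - ctrl
    slope = np.polyfit(np.arange(years), signal, 1)[0]
    print(f"{name:5s} mean signal {signal.mean():+7.2f} mm, trend {slope:+.3f} mm/yr")
print(f"CTRL interannual std: {ctrl.std():.2f} mm")
```

Under this kind of comparison, weak CO2, NDEP, and LULCC signals relative to the climate signal and to control-run variability would lead to the attribution conclusion stated above.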