Since the 1970s, scientists have developed statistical methods intended to formalize the detection of changes in global climate and the attribution of such changes to relevant causal factors, both natural and anthropogenic. Detection and attribution (D&A) of climate change trends is commonly performed using a variant of Hasselmann’s “optimal fingerprinting” method, which involves a linear regression of historical climate observations on corresponding output from numerical climate models. However, it has long been known in the field of time series analysis that regressions involving “non-stationary” or “trending” variables are, in general, statistically inconsistent and often spurious. When non-stationarity is caused by “integrated” processes, as is likely the case for climate variables, consistency of least-squares estimators depends on “cointegration” between the regressand and the regressors. This study has shown, using an idealized linear-response-model framework, that if standard assumptions hold then the optimal fingerprinting estimator is consistent, and hence robust against spurious regression. In the case of global mean surface temperature (GMST), parameterizing abstract linear response models in terms of energy balance lends this result physical interpretability. Hypothesis tests conducted using observations of historical GMST and simulation output from 13 CMIP6 general circulation models produced no evidence that the standard assumptions required for consistency were violated. It is therefore concluded that, at least in the case of GMST, detection and attribution of climate change trends is very likely not spurious regression. Furthermore, the detection of significant cointegration between observations and model output indicates that the least-squares estimator is “superconsistent”, with better convergence properties than might previously have been assumed. Finally, a new method has been developed for quantifying D&A uncertainty, exploiting the notion of cointegration to eliminate the need for pre-industrial control simulations.
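To make the fingerprinting-plus-cointegration logic concrete, the following is a minimal sketch rather than the study’s actual pipeline: it regresses a synthetic stand-in for observed GMST on a synthetic model-simulated GMST, then applies standard augmented Dickey–Fuller and Engle–Granger tests from statsmodels. The variable names (`obs_gmst`, `model_gmst`) and the generated series are illustrative assumptions, not real data.

```python
# Hedged sketch: optimal-fingerprinting-style regression with unit-root and
# cointegration diagnostics. Synthetic series stand in for real GMST records.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(0)
n = 170  # roughly the length of the instrumental record, in years

# Synthetic "model" GMST: an integrated (unit-root) forced trend plus noise.
forced = np.cumsum(rng.normal(0.01, 0.02, n))    # stochastic trend, I(1)
model_gmst = forced + rng.normal(0.0, 0.05, n)   # simulated response

# Synthetic "observed" GMST: scaled forced signal plus internal variability.
beta_true = 1.0
obs_gmst = beta_true * forced + rng.normal(0.0, 0.08, n)

# Unit-root (ADF) tests: failure to reject the null suggests each series
# is integrated, i.e. the setting in which spurious regression is a risk.
for name, series in [("obs", obs_gmst), ("model", model_gmst)]:
    stat, pval, *_ = adfuller(series)
    print(f"ADF {name}: stat={stat:.2f}, p={pval:.3f}")

# Fingerprinting-style OLS: regress observations on model output.
X = sm.add_constant(model_gmst)
fit = sm.OLS(obs_gmst, X).fit()
print(f"scaling factor beta = {fit.params[1]:.3f}")

# Engle-Granger test: rejecting the null of no cointegration indicates the
# regression is not spurious and the OLS estimator is (super)consistent.
stat, pval, _ = coint(obs_gmst, model_gmst)
print(f"Engle-Granger: stat={stat:.2f}, p={pval:.3f}")
```

Because the synthetic residual `obs_gmst - beta_true * forced` is stationary by construction, the two series are cointegrated and the test should reject, mimicking the non-spurious case described above.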
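The energy-balance parameterization invoked above can be illustrated with the simplest zero-dimensional (“one-box”) energy balance model; the notation here is a standard textbook form and is assumed rather than taken from the study:

$$C\,\frac{dT}{dt} = F(t) - \lambda\,T(t),$$

whose solution (with $T(0)=0$) is a linear response, i.e. a convolution of the forcing with an exponential impulse response,

$$T(t) = \int_{0}^{t} G(t-s)\,F(s)\,\mathrm{d}s, \qquad G(t) = \frac{1}{C}\,e^{-\lambda t / C},$$

where $T$ is the GMST anomaly, $F$ the radiative forcing, $C$ an effective heat capacity, and $\lambda$ the climate feedback parameter. Under this parameterization the impulse response $G$ of the abstract linear response model acquires a direct physical meaning, which is what gives the consistency result its energy-balance interpretation.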