There is an enduring debate over whether student- and school-level covariates should be included in value-added school effects models in addition to prior achievement. Proponents argue that adding covariates allows fairer comparisons of schools, whereas opponents argue that it excuses poorly performing schools and obscures policy-relevant school differences. School-level covariates are statistically problematic, but it has been argued that mean school prior achievement should be included in school effects analyses to reduce error. This article reports school effects analyses of Australia-wide data on approximately 1.5 million students in primary and secondary schools who took national assessments in five achievement domains between 2013 and 2018. With appropriate controls for prior achievement, school effects are generally small and most often not statistically significant. Adding student-level covariates further reduces school effects, since part of the school effect is absorbed by the covariates, whose effects are unlikely to reflect causal social processes; it also reduces the proportion of schools with significant school effects, does not improve predictive power, increases the amount of missing data, and further reduces the consistency of school effects between domains and their stability over time. Mean school prior achievement did not improve consistency or stability. Incorporating covariates in school effects analyses opens a Pandora's box of specification and measurement issues, undermining the legitimacy of school comparisons. It is concluded that researchers and administrators of educational jurisdictions should focus mainly on simpler models based on prior achievement.