The massive technical and computational progress in biomolecular crystallography has generated some adverse side effects. Most crystal structure models, produced by crystallographers or well-trained structural biologists, constitute useful sources of information, but occasional extreme outliers remind us that the process of structure determination is not fail-safe. The occurrence of severe errors or gross misinterpretations raises fundamental questions: Why do such aberrations emerge in the first place? How do they evade the sophisticated validation procedures that often produce clear and dire warnings, and why are severe errors not noticed by the depositors themselves, their supervisors, referees and editors? Once such models are detected, what can be done to correct, improve or eliminate them? How do incorrect models affect the underlying claims or biomedical hypotheses they were intended, but failed, to support? What is the long-range effect of the propagation of such errors? And finally, what mechanisms can be envisioned to restore the validity of the scientific record and, if necessary, retract publications that are clearly invalidated by a lack of experimental evidence? We suggest that cognitive bias and flawed epistemology are likely at the root of the problem. Using examples from the published literature and from public repositories such as the Protein Data Bank, we provide case summaries to guide the correction or improvement of structural models. When strong claims are unsustainable because of a deficient crystallographic model, removal of that model, and even retraction of the affected publication, are necessary to restore the integrity of the scientific record.