SUMMARY DOCTORAL DISSERTATION

Psychology is facing a “replication crisis”: many psychological findings could not be replicated in new samples, which has led to the growing concern that many published findings are overly optimistic or even false. In this dissertation, we investigated potential indicators of problems in the published psychological literature.

In Part I, we looked at inconsistencies in reported statistical results in published psychology papers. To facilitate this research, we developed the free tool statcheck, a “spellchecker” for statistics that recomputes p-values from reported test statistics and degrees of freedom and flags results that do not match.

In Part II, we investigated bias in published effect sizes. We showed that in the presence of publication bias, combining studies in a meta-analysis can make the overestimation of effect sizes even worse. Indeed, in meta-analyses from the social sciences we found strong evidence that published effects are overestimated.

These are worrying findings, and it is important to think about concrete solutions to improve the quality of psychological research. Solutions we propose include preregistration, replication, and transparency. We argue that to select the best strategies to improve psychological science, we need research on research: meta-research.
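To illustrate the kind of consistency check described in Part I, the sketch below recomputes the p-value for a reported t-test and compares it with the p-value given in the paper. This is a simplified, hypothetical Python sketch, not the actual statcheck implementation (statcheck itself is an R package and additionally accounts for rounding of the reported statistic); the function name and tolerance parameter are illustrative assumptions.

```python
from scipy import stats

def check_reported_p(test_stat: float, df: int, reported_p: float,
                     tolerance: float = 0.005) -> bool:
    """Hypothetical, simplified statcheck-style check: recompute the
    two-tailed p-value for a reported t-test and see whether it is
    close to the p-value reported in the paper."""
    recomputed_p = 2 * stats.t.sf(abs(test_stat), df)
    return abs(recomputed_p - reported_p) <= tolerance

# Example: a paper reports "t(28) = 2.20, p = .04".
# The recomputed two-tailed p is about .036, so the result is consistent.
print(check_reported_p(test_stat=2.20, df=28, reported_p=0.04))
```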
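To illustrate the Part II argument, the simulation below is a minimal, hypothetical sketch: it generates many small two-group studies with a modest true effect, “publishes” only the statistically significant ones, and shows that the average published effect size, and hence any meta-analysis that pools only these studies, overestimates the true effect. All parameter values are illustrative assumptions, not figures from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)
true_effect, n_per_group, n_studies = 0.2, 25, 10_000

# Simulate many two-group studies and keep only those reaching p < .05,
# mimicking a literature shaped by publication bias.
published = []
for _ in range(n_studies):
    a = rng.normal(true_effect, 1.0, n_per_group)
    b = rng.normal(0.0, 1.0, n_per_group)
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    d = (a.mean() - b.mean()) / pooled_sd      # Cohen's d
    t = d * np.sqrt(n_per_group / 2)           # t statistic for equal groups
    if abs(t) > 2.01:                          # roughly p < .05 at df = 48
        published.append(d)

print(f"true effect: {true_effect}")
print(f"mean 'published' (significant-only) effect: {np.mean(published):.2f}")
# The average of the selectively published effects is much larger than the
# true effect, and pooling them in a meta-analysis inherits that bias.
```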