Creativity is widely considered a skill essential to succeeding in the modern world. Numerous creativity training programs have been developed, and several meta-analyses have attempted to summarize the effectiveness of these programs and identify the features influencing their impact. Unfortunately, previous meta-analyses share a number of limitations, most notably overlooking the potentially strong impact of publication bias and the influence of study quality on effect sizes. We undertook a meta-analysis of 169 creativity training studies spanning 5 decades (844 effect sizes, the largest meta-analysis of creativity training to date), including a substantial number of unpublished studies (48 studies; 262 effect sizes). We employed a range of statistical methods to detect and adjust for publication bias and evaluated the robustness of the evidence in the field. In line with previous meta-analyses, we found a moderate training effect (0.53 SDs; unadjusted for publication bias). Critically, we observed converging evidence consistent with strong publication bias: all adjustment methods considerably lowered our original estimate (adjusted estimates ranged from 0.29 to 0.32 SDs). This severe bias casts doubt on the representativeness of the published literature in the field and on the conclusions of previous meta-analyses. Our analysis also revealed a high prevalence of methodological shortcomings in creativity training studies (likely to have inflated our average effect) and few signs of methodological improvement over time, a situation that limits the usefulness of this body of work. We conclude by presenting implications and recommendations for researchers and practitioners, and we propose an agenda for future research.
Public Significance Statement

Creativity is considered an essential skill in many contexts, leading to the development of numerous training programs aimed at improving this skill. We examined 5 decades of creativity training studies (169 studies). Using a range of meta-analytic techniques, we found converging evidence consistent with publication bias. Moreover, our analysis revealed a large number of studies with methodological shortcomings. Both of these findings suggest that creativity training may be less effective than previously thought. In view of these findings, we propose a set of recommendations to assist researchers and practitioners in interpreting findings from the creativity training literature, and we outline directions for future research to increase the informativeness of creativity training studies.