This systematic review investigated the topics studied and reporting practices of published meta‐analyses in educational measurement. Our findings indicated that meta‐analysis is not a widely used methodological tool in educational measurement: on average, fewer than one meta‐analysis has been published per year over the past 30 years (28 meta‐analyses were published between 1986 and 2016). Within the field, researchers have used meta‐analysis to study three primary subject areas: test format effects, test accommodations, and the predictive validity of operational testing programs. With regard to reporting practices, authors often failed to provide descriptive details of their search strategy and sample characteristics, limiting the reproducibility and generalizability of findings, respectively. Furthermore, diagnostic analyses of outliers, publication bias, and statistical power were not reported for the majority of studies, calling into question the validity of inferences drawn from the sampled meta‐analyses. The lack of transparent and replicable practices in meta‐analyses of educational measurement is a concern for generating credible research syntheses that can help the field improve evidence‐based practice. Recommendations are provided for improving the training and editorial standards of meta‐analytic research.