When I started my job as research director of the Massachusetts Department of Elementary and Secondary Education twelve years ago, I thought my job was to figure out what worked. My agency was just beginning to have access to new and exciting longitudinal data on students and educators. I envisioned that I'd use those data along with strong designs for causal inference to determine which programs and policies were working and which were not. Once we knew those answers, I figured, we would get better policy that would improve outcomes for Massachusetts' students.

But in my twelve years in this job, I've learned that the process of improving policy¹ through research is much subtler and more complex than I had initially imagined. Research influences policy more often than much of the academic community thinks, and more frequently every day as we learn how to do this work better. But its influence is less linear than researchers expect, and it is driven as much by relationships and organizational capacity as by the actual information studies produce. Research use operates through conversations, not code; structures in organizations, not standard errors; relationships, not randomized controlled trials.

I worry that the growing national efforts to connect research and policy too frequently start from the same "find what works" frame of mind that I did twelve years ago. The "find what works" approach misunderstands the problem of research use as one of lack of information (either lack of information about the impact of a policy or lack of awareness by the policy maker about the available information) and a need for "translation" across sectors (Penuel et al. 2015). This belies the research literature about how research actually plays into the policy decision process. If

1. This essay centers on the influence of research on policy, rather than practice. This is both because I have more expertise in the policy process than I do in issues of direct practice and because AEFP members' research tends to focus more on policy. I suspect, however, that many of the same insights would also apply in practice settings.
State longitudinal data systems (SLDSs) have created more opportunities than ever before for rigorous research to influence education policy decisions. As state practitioners who play central roles in building and using our states' longitudinal data systems, we are excited about their promise for supporting policymaking and research. Yet we also recognize that the data in SLDSs will not answer many of our most pressing research questions, nor will the presence of these systems by itself create the meaningful collaboration between researchers and practitioners that we believe is needed to inform our states' policy landscapes. The barriers to the kinds of research we need are mostly unrelated to the promises of SLDSs. We outline the challenges we have experienced in developing research agendas, building our internal capacity for research, and working with external partners, and we identify the research questions we need to answer that are not easily addressed with SLDS data.
Education policy makers must make decisions under uncertainty. Thus, how they think about risks has important implications for resource allocation, interventions, innovation, and the information that is provided to the public. In this policy brief we illustrate how the standard of evidence for making decisions can be quite inconsistently applied, in part because of how research findings are reported and contextualized. We argue that inconsistencies in evaluating the probabilities of risks and rewards can lead to suboptimal decisions for students. We offer suggestions for how policy makers might think about the level of confidence they need to make different types of decisions and how researchers can provide more useful information so that research might appropriately affect decision making.
Policy briefs written by academics—the kind typically published in Education Finance and Policy—should be a crucial source of information for policy makers. Yet too frequently these briefs fail to garner the consideration they deserve. Their authors are too focused on the potential objections of their fellow academics, who are concerned with rigor and internal validity, instead of the objections of policy makers, who are concerned with generalizability, understandability, and utility. And researchers too often believe that simply publishing a brief is sufficient to communicate its results. By focusing briefs on topics on the policy agenda, helping policy makers see their constituents in the results, writing clearly, studying implementation and not just outcomes, weighing evidence and drawing conclusions, and reaching out to policy makers beyond publication, researchers have the greatest potential to see their work influence public policy.
The COVID-19 pandemic forced schools to quickly cobble together new remote teaching and learning programs in spring 2020, but now that they have a little more time, they can step back and evaluate the programs they’ve put in place. Nora Gordon and Carrie Conaway describe how school and district leaders can evaluate their online programs without using complex statistics.