
Unlocking the Potential of the “What Works” Approach to Policymaking and Practice: Improving Impact Evaluations

The emphasis on using well-designed impact evaluations comes from a justifiable concern that too many prior evaluations of the effects of policies and programs used methods whose conclusions generated debate. Over the past 20 years, the research and evaluation community, supported by the Institute of Education Sciences (IES) within the U.S. Department of Education, research units within the U.S. Department of Health and Human Services (HHS) and the U.S. Department of Labor (DOL), and various foundations, has made great progress fixing that problem. For example, in 2008, when the IES Board issued a report on the Institute's first five years (National Board for Education Sciences, 2008), it applauded the improved rigor of impact evaluation work in education (e.g., better study designs, analysis methods, measures). The Board also called for increased attention to relevance (i.e., ensuring that IES studies answer questions of importance to policymakers and practitioners). In doing so, it noted that the research community had made more progress on designing impact studies to assess "what works for whom" than it had on the questions of "under what conditions and why." This paper suggests three ways to make impact evaluations more relevant to policy and practice: (1) emphasize learning from all studies over sorting out winners and losers; (2) collect better information on the conditions that shape an initiative's success or failure; and (3) learn more about the features of programs and policies that influence effectiveness. Implementing each of these recommendations will improve the fundamental understanding of social problems, while also generating practical guidance for mitigating those problems—guidance that may prove critical to unlocking the potential of the "what works" approach. Implementing these recommendations, however, adds costs to evaluations that practitioners and policymakers already see as expensive. Later in this article, we offer suggestions for finding the necessary resources through greater efficiencies in other areas of data collection.

Access PDF: http://journals.sagepub.com/doi/abs/10.1177/1098214015594420

 

Citation: Granger, R., & Maynard, R. A. (2015). Unlocking the Potential of the "What Works" Approach to Policymaking and Practice: Improving Impact Evaluations. American Journal of Evaluation, 558-569. DOI: 10.1177/1098214015594420
