Systematic reviews provide an exhaustive summary of the literature relevant to a particular question or topic, using an objective and transparent approach to research synthesis. This method of reviewing existing research and assessing overall impacts has become increasingly popular in international development research. Mathematica has extensive experience conducting systematic reviews and assessing the impacts of development and humanitarian interventions.
For the School Dropout Prevention Pilot Project, we worked with our partners Creative Associates and School to School to develop a methodology, based on the What Works Clearinghouse model, for reviewing existing U.S. and international evidence on interventions designed to prevent student dropout. We then designed the pilot project interventions to draw on strategies supported by rigorous evidence, coupled with an in-depth assessment of local dropout trends and issues in each of the four countries.
Mathematica has conducted systematic reviews in a range of settings. The What Works Clearinghouse (WWC) for the U.S. Department of Education promotes the use of research to inform education decision making. It operates through a web-based dissemination system centered on systematic reviews of research on the effectiveness of educational interventions. The WWC also develops research standards, and its staff train and certify reviewers in the process of conducting systematic reviews, the WWC's evidence standards, and its ratings of effectiveness.
Other examples include a thorough review of the home visiting research literature that assessed the evidence of effectiveness for home visiting programs targeting families with pregnant women and children from birth to age 5. That review assessed the quality of individual research studies and evaluated the strength of evidence for specific home visiting program models. In our comprehensive review of the evidence base for programs to prevent teen pregnancy, we screened studies against inclusion criteria, assigned each included study a quality rating reflecting the rigor and execution of its research design, and grouped programs into evidence categories. The federal government used the results of this review to identify pregnancy prevention programs backed by evidence of effectiveness, and thus eligible for priority grant funding, as part of a major national policy initiative.