This guide is designed to help practitioners and researchers work together to design an impact study with baseline equivalence and, in turn, learn how to determine whether an impact study is likely to produce meaningful results. It is part of a series of guides that help practitioners assess their scaling efforts critically, collect evidence on the effectiveness of their interventions, and increase the likelihood of effectively scaling successful interventions.

Scaling Evidence-Based Models (SEBM) Project

The Office of Research and Evaluation (ORE) initiated the Scaling Evidence-Based Models (SEBM) project to support the scaling of effective interventions. This guide is part of that project, which includes additional resources that contribute to the study and application of scaling effective interventions. Additional scaling resources are listed below:

Guides:

  • Scaling an Intervention: Recommendations and Resources: This guide provides five key recommendations to help funders such as AmeriCorps, other government agencies, and philanthropic organizations identify which funded interventions are effective, enhance their knowledge base on scaling them, and pursue scaling.

  • How to Fully Describe an Intervention: This guide is intended to help practitioners thoroughly describe their intervention and communicate that description to potential funders or stakeholders.

  • Build Organizational Capacity to Implement an Intervention: This guide will help practitioners prepare to implement their desired intervention through building organizational capacity, which involves establishing the organizational structure, workforce, resources, processes, and culture to enable success.

  • How to Structure Implementation Supports: This guide will help practitioners develop formal strategies (also known as implementation supports) to help consistently deliver an intervention as it was designed, which is especially helpful for organizations scaling an intervention and assessing implementation fidelity.

  • Making the Most of Data: This guide will help practitioners maximize the use of their intervention data to help their organizations improve program implementation and provide evidence to funders about effectiveness.

  • What Makes for a Well-Designed, Well-Implemented Impact Study: This guide is intended to help practitioners ensure that their evaluators produce high-quality impact studies.

  • Scaling Programs with Research Evidence and Effectiveness (SPREE): This article focuses on how foundations can apply the SPREE process and provides insights into conditions that can help identify and support effective interventions that are ready to be scaled.

  • Scaling Evidence-Based Models: Document Review Rubrics: This guide provides a two-part rubric for systematically reviewing documents, which will help practitioners identify the critical components of intervention effectiveness and describe plans for scaling the effective intervention.


Tools:

  • Scaling Checklists: Assessing Your Level of Evidence and Readiness (SCALER): This report describes a framework that identifies how organizations can improve both their readiness to scale an intervention and the intervention’s readiness to be scaled, so that intervention services are best positioned to improve outcomes for a larger number of participants. Each checklist in the SCALER provides summary scores to reflect how ready an intervention and organization might be for scaling.

Reports:

Case Studies:

Further information

Program/Intervention: Best Practice Dissemination, AmeriCorps Research Guidance to Scale Programming
Implementing Organization: AmeriCorps Office of Research and Evaluation Commissioned Report
AmeriCorps Program(s): AmeriCorps State and National; AmeriCorps Seniors; Social Innovation Fund; Office of Research and Evaluation
Focus Population(s)/Community(s): Opportunity Youth; Schools; Nonprofits; Tribes; Veterans and Military Families; Rural; Suburban; Urban; Low-income
Study Type(s): Review or Meta-analysis
Study Design(s): Non-experimental
Researcher/Evaluator: Mathematica
Published Year: 2018