
EVIRATER™: Rating the Strength of Evidence in Evaluations

Abt Global has developed a system to assess the strength of evidence behind conclusions drawn by evaluators about the effectiveness of programs or initiatives.

Policymakers, program leaders, and philanthropic organizations increasingly demand that decision-making about policies and programs be guided by evidence. However, few tools are available to assess the quality of the full range of evidence produced by the spectrum of research designs typically used to evaluate program effects.

Abt Global is responding to this need with EVIRATER™, a system for rating the strength of evidence that can be applied to the full range of designs used to evaluate programs. EVIRATER™ expands on current rating systems, such as the What Works Clearinghouse and the Clearinghouse for Labor Evaluation and Research, that are used to review more rigorous research designs such as randomized controlled trials and high-quality quasi-experimental studies.

“Many evaluations of federally-funded initiatives as well as foundation-funded grant programs use study designs that are not reviewable in any of the current rating systems,” said Barbara Goodson, EVIRATER™ Developer. “EVIRATER™ supports reviews of evaluations that provide insight on the potential effectiveness of policies and programs.”

What is the EVIRATER rating system?

An EVIRATER™ review assigns a strength-of-evidence rating to each impact estimate reported by a study. These ratings categorize the evidence as Strong, Emerging, Limited, Weak, or No Evidence.

This rating is based on the quality of the outcome measures, design features, and methods of data analysis. EVIRATER™ ratings indicate the extent to which a study's findings support causal statements about program effects.

EVIRATER™ allows for review of studies representing a continuum of evidence, including the experimental and quasi-experimental studies typically reviewable within existing federally endorsed rating systems as well as evaluations using interrupted time series, weaker quasi-experimental, or one-group pre-post designs. EVIRATER™ provides a systematic approach for distinguishing between different types of study designs, as well as between stronger and weaker applications of a given design.
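The logic described above — an ordered set of evidence levels assigned to each impact estimate based on outcome-measure quality, design features, and analysis methods — can be sketched in Python. The function name, inputs, and cut points below are illustrative assumptions, not the actual EVIRATER™ rubric; only the five category labels come from the system itself.

```python
from enum import IntEnum


class EvidenceLevel(IntEnum):
    """The five EVIRATER(TM) categories, ordered weakest to strongest."""
    NO_EVIDENCE = 0
    WEAK = 1
    LIMITED = 2
    EMERGING = 3
    STRONG = 4


def rate_impact_estimate(outcome_measure_ok: bool,
                         design_score: int,
                         analysis_ok: bool) -> EvidenceLevel:
    """Hypothetical rating rule for a single impact estimate.

    Inputs stand in for the three dimensions the text names: quality of
    outcome measures, design features (here a 1-3 score, with stronger
    designs such as RCTs earning higher scores), and data-analysis
    methods. The cut points are invented for illustration.
    """
    if not outcome_measure_ok:
        return EvidenceLevel.NO_EVIDENCE
    if not analysis_ok:
        return EvidenceLevel.WEAK
    if design_score >= 3:      # e.g., well-executed RCT
        return EvidenceLevel.STRONG
    if design_score == 2:      # e.g., strong quasi-experimental design
        return EvidenceLevel.EMERGING
    return EvidenceLevel.LIMITED  # e.g., one-group pre-post design
```

Using an ordered enum means ratings can be compared directly (e.g., `EvidenceLevel.STRONG > EvidenceLevel.WEAK`), which mirrors how a cross-site reviewer might sort findings along the continuum of evidence.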

How has Abt used EVIRATER?

Project LAUNCH: EVIRATER™ – formerly called R-SEED – was first used systematically to address issues emerging from Abt’s role as a cross-site evaluator of state- and community-based evaluations and implementations of Project LAUNCH. Project LAUNCH, funded by SAMHSA, is an effort to promote healthy development and wellness in children from birth to age eight by integrating maternal and child mental and behavioral health into a wide range of community services. Three cohorts of grantees – 24 in all – received five-year grants to achieve these objectives.

The rigor and breadth of the local evaluations varied across the Project LAUNCH grantees, so EVIRATER™ was used to assess the overall effectiveness of Project LAUNCH in different outcome domains and to identify the strength of evidence behind each grantee's findings. The Project LAUNCH final outcomes report, which showcases the applicability of EVIRATER™, was submitted to SAMHSA in December 2014.

"The Abt webinar helped us as we developed our evaluation by confirming the design and instruments selected to support a rigorous evaluation."

— Participant in a webinar co-hosted by Abt Global for Virginia Math/Science Partnership grantees

Math/Science Partnerships (MSP) Annual Conference Presentation, September 2014, Washington, D.C.: Abt Senior Associate Ellen Bobronnikov, project director of Abt's evaluation of the U.S. Department of Education's Math Science Partnership (MSP) grant program, proposed describing EVIRATER™ to the state coordinators responsible for distributing funds to MSP grantees as a potentially useful tool for understanding the strengths of evaluations proposed by prospective grantees.

In September 2014, Abt’s Catherine Darrow and Todd Grindal presented a summary of the EVIRATER™ system to approximately 20 state coordinators, four contracted evaluators, and Department of Education program staff at the MSP Annual meeting held in Washington, D.C. In this context, EVIRATER™ was introduced as a tool to increase the capacity of state coordinators and their contracted evaluators.

Virginia Math/Science Partnerships (MSP) Webinar for Prospective Grant Applicants, October 2014, co-hosted by Abt Global and Eric Rhoades, MSP Virginia State Coordinator: Following the MSP Annual Conference presentation, Eric Rhoades – Director of the Virginia Department of Education's Office of Science and Health Education – contacted the EVIRATER™ team and asked whether Abt Global could advise the Commonwealth of Virginia on strengthening the research designs proposed by prospective math/science partnership grantees. Rhoades believed that EVIRATER™ and its associated resources could give applicants clear guidance on methods for proposing and conducting the most rigorous evaluation designs possible. In October, Todd Grindal and Catherine Darrow hosted a one-hour webinar with 15 Virginia MSP applicants and Virginia Department of Education staff.

After the presentation, Rhoades said that Abt provided significant support through the EVIRATER™ webinar. The state received positive feedback via email and in the MSP proposals themselves. One comment from a proposal read: “The Abt webinar helped us as we developed our evaluation by confirming the design and instruments selected to support a rigorous evaluation.”

Read more about Abt’s work in experimental evaluations, technical assistance, and research on evaluation methods.
Subscribe to the Evaluation and Methods Digest.
