New Guide to Program Evaluation Released

Measuring Success: A Guide to Becoming an Evidence-Based Practice, a new guide from the Vera Institute.

As funders, programs, and the public increasingly understand the importance of evidence-based practice, nonprofit leaders are feeling more pressure to prove the effectiveness of their initiatives. To help organizations interested in creating or strengthening a research base for their work, the Vera Institute of Justice recently released a new publication, “Measuring Success: A Guide to Becoming an Evidence-Based Practice.” The guide, which was funded by the MacArthur Foundation as part of its Models for Change initiative, is written primarily for juvenile justice initiatives, but may be helpful for other youth-serving programs as well. [Editor’s note: The MacArthur Foundation is a funder of the JJIE.] Although evaluation is beneficial for any program, the guide cautions that “the steps described here are neither simple nor easy.”

The guide is based on Vera’s experience working with many different types of juvenile justice initiatives at various points in their growth and evaluation processes. It helps organizations prepare for evaluations of their programs, and explains many of the terms and methods used in social science research.

The publication suggests that organizations begin with a process evaluation, comparing the program’s current operations to the goals, objectives, and plans developed for the program at its inception. The next step, according to the authors, is the actual outcome evaluation, which involves identifying a study group and possibly a control group. The publication also describes different types of quantitative data that could be used to show outcomes (e.g., program attendance or re-arrest rates) and discusses the use of qualitative data (e.g., quotes taken from surveys of participants) to add another layer to evaluations. It also provides guidance on using the results of outcome evaluations.

Even if a program doesn’t have the resources necessary to engage in a full evaluation of its work, “data collection, monitoring, and reporting are critical for good program planning and pave the way to developing an evaluation capacity,” wrote Annie Salsich, director of Vera’s Center on Youth Justice, in her foreword to the guide.
