JJIE Hub: Key Issues — Evidence-based Practices

In the juvenile justice arena, the term “evidence-based practices” (commonly called “EBPs”) generally refers to programs, practices, and policies that have been rigorously evaluated and shown to be effective at preventing or reducing youth crime.[1]

Programs

Practices and Policies
General practices and broad policies may also be considered evidence-based if rigorous research supports them, though currently there is much less research on practices and policies than on programs.[3]

    • By “policies” we are referring to broad strategies to reduce offenses. For example, if wide-scale implementation of a community-based supervision strategy were rigorously evaluated and found to reduce the total incidence of probation revocations, this might be termed an evidence-based policy.
    • By “practices” we are referring to generic practices, such as mentoring and skill development, that agencies treating youth may use on their own, as part of generic programs, or in combination with brand-name treatment programs.[4]

Many more programs have been rigorously evaluated to determine their effectiveness than have policies or practices. Evidence-based programs come in a number of varieties; below are some common categories:

A.   Intensive Family-based Treatment

B.   Psychosocial Therapies

C.   Medication Therapies

D.   Trauma-informed Care

While there are no standardized, universally accepted criteria that a practice must meet to be classified as evidence-based, below are the criteria many experts consider in making this determination:

    • Has the strategy been shown through rigorous evaluation to prevent youth from committing offenses or to successfully treat youth in trouble with the law? To meet this standard, some experts require that the evaluation studies be performed both in research settings—such as a controlled clinical trial—and in real-world settings in the community. Furthermore, the intervention must have repeatedly produced positive outcomes in both settings; more specifically, it must have produced the intended outcome for which it was developed.[5]
    • Does the strategy have standardized, replicable practices that have been scientifically documented to consistently produce the intended positive results?[6]
    • Can the study results showing effectiveness be attributed to the program, as opposed to other circumstances or events?[7]
    • Has the study been subjected to favorable “critical peer review” – meaning it was reviewed by experts in the field other than the people who developed the program, and they found the program to be effective?[8]

Programs that meet many of the standards detailed above may be labeled evidence-based and endorsed by a federal agency or research organization that reviews such programs. Several organizations maintain registries of recommended evidence-based programs, such as Blueprints for Healthy Youth Development, the Office of Juvenile Justice and Delinquency Prevention’s Model Programs Guide, and the Office of Justice Programs’ CrimeSolutions.gov website.

Practices labeled as “evidence-based” have been the most rigorously evaluated (see above). However, other strategies may also have indicators of reliability, though they have not been studied enough to be considered evidence-based. Some examples:

    • Research-based
      A program or practice that has some research demonstrating effectiveness. While the type and amount of research needed to qualify varies, in general a program is considered research-based when at least one randomized or statistically controlled study indicates that it is effective, but it does not yet meet the definition of “evidence-based.”[9]
    • Promising Programs
      Programs that have not been rigorously tested but which, based upon preliminary information such as statistical analyses or well-established theories of change, have the potential to become research-based or evidence-based.[10]
    • Community Defined Evidence
      Practices that have yielded positive results as determined by community consensus, but which have not been measured empirically.[11]
    • Evidence-Generating Policies[12]
      With advance planning, practices can be put in place to generate and collect evidence regarding a new practice’s efficacy as it is being implemented. For example, for many new policies, a modest investment ahead of time could identify reasonable comparison groups that could generate research evidence regarding the policy. A slight delay in program implementation to collect pre-measures could also have a large payoff in terms of research information.
    • Additional advance work helpful in planning evidence-generating practices includes thinking through a policy’s intended outcome; identifying key threats to the policy and feasible approaches to addressing them; and identifying the data available to assess the outcomes, along with the strengths and weaknesses of those data.

Recidivism
Recidivism is a key measurement for evaluating the effectiveness of juvenile justice programs; it measures whether a youth reoffends.[13] However, recidivism itself is measured in a number of different ways; it can be defined at a variety of points of renewed contact with the juvenile justice system, such as re-arrest, re-conviction, or re-incarceration.[14]

Desistance[15]
“Desistance,” or the cessation of offending, is a measure related to recidivism, but rather than looking at one particular marker, such as re-arrest, program evaluations focusing on desistance look for gradual, incremental, long-term changes in a young person, with attention to the youth’s asset development as well as a reduction in offenses in both the short and long term.

For example, a desistance-focused evaluation of a community-based after-school program may look at intermediate outcome measures of the intervention, such as the youth’s development of competence in a skill they are being taught, development of positive relationships with adult role models, and educational gains, as well as the effects on the youth’s short- and long-term behavior, including any subsequent law violations.

    • The American Prosecutors Research Institute provides an example of this approach. It developed performance measures for juvenile justice systems that included factors such as community safety, offender accountability, and youths’ competency development, in addition to the law-abiding behavior of youth in the justice system. The intermediate outcome measures evaluated factors such as a youth’s community service, participation in school, and resistance to drugs and alcohol.[16]

Targeted Behaviors
Programs targeting a particular issue, such as substance abuse, are used in juvenile justice as well as with other populations, and effectiveness is generally measured by evaluating changes in the behavior being targeted. For example, evaluations of substance abuse treatment programs typically focus on levels of use and abstinence. Some organizations recommend looking at other indicators of treatment success as well, such as employment, mental and physical health, and arrests.[17]

Cost-Benefit
A cost-benefit analysis is a way to calculate society’s return from investing in a program; it helps to answer the question of whether the program is more valuable than other opportunities that could be pursued with the same resources. This type of analysis has garnered increasing attention in recent years, due to concern over the high cost of confining youth.[18]
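
As a rough illustration of the arithmetic involved, below is a minimal sketch in Python with invented per-youth figures (not actual WSIPP estimates): the analysis nets a program’s benefits to society against its costs and expresses the result as a benefit-cost ratio.

    # Toy cost-benefit comparison with hypothetical figures.
    program_cost_per_youth = 7_500    # assumed cost of delivering the program
    benefits_per_youth = 22_000       # assumed savings: avoided confinement,
                                      # victim costs, and future system contacts

    net_benefit = benefits_per_youth - program_cost_per_youth   # $14,500
    bc_ratio = benefits_per_youth / program_cost_per_youth      # ~2.9

    print(f"Net benefit per youth: ${net_benefit:,}")
    print(f"Benefit-cost ratio: {bc_ratio:.1f} to 1")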

See the Resources section for further information on cost-benefit reports from the Washington State Institute for Public Policy (WSIPP) and other organizations.

Understanding how programs are evaluated can help in gauging their effectiveness. In addition to considering the criteria used to qualify a program as evidence-based, there may be differences in how effectiveness is measured, the type of study done, and the quality of the research, as discussed below.

Statistics and Effect Size
Measurements are often expressed through statistics – for example, Group A in a study reduced recidivism by 50% and Group B by 30%. However, the level of significance that should be attached to this finding depends on a number of factors, such as the number of youth in the study. For instance, if there were only ten youth in the above example, the results would not be statistically significant.[21]
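
To make the sample-size point concrete, here is a minimal sketch (not from the cited source) using Python and SciPy: the same 30% versus 50% recidivism rates produce very different p-values depending on how many youth are in each group.

    # Hypothetical counts: the same proportions tested at two sample sizes.
    # With small groups the difference is not statistically significant;
    # with larger groups it is.
    from scipy.stats import fisher_exact

    for n in (10, 100):  # youth per group
        group_a = [round(0.30 * n), n - round(0.30 * n)]  # [reoffended, did not]
        group_b = [round(0.50 * n), n - round(0.50 * n)]
        _, p_value = fisher_exact([group_a, group_b])
        print(f"{n} youth per group: p = {p_value:.3f}")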

In addition to statistical significance, researchers frequently use “effect size” to measure the impact of a juvenile justice intervention. Effect size measures the size of the difference between two groups, independent of the sample size, in order to more clearly indicate the change caused by a particular intervention.[22]
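
Below is a minimal sketch of the calculation described in note [22], dividing the difference between group means by a standard deviation. A pooled standard deviation is used here, which is one common choice (an assumption, not specified in the source); the scores are invented.

    import math
    import statistics

    def effect_size(treatment, control):
        """Cohen's d: difference in group means divided by the pooled SD."""
        n1, n2 = len(treatment), len(control)
        v1, v2 = statistics.variance(treatment), statistics.variance(control)
        pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
        return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

    # Hypothetical outcome scores for two groups of youth.
    print(effect_size([4, 5, 6, 7, 8], [2, 3, 4, 5, 6]))  # ~1.26, a large effect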

Type of Study
In addition to individual evaluations, systematic research reviews and meta-analyses are increasingly being used to assess whether programs, practices, and policies are effective. These techniques allow social scientists to analyze results from a large number of evaluations—sometimes from hundreds of individual studies—and integrate the findings to draw conclusions.[23] The largest example of this type in the juvenile justice field is Mark Lipsey’s analysis of 548 studies of juvenile delinquency interventions performed between 1958 and 2002.[24]
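
As a simplified illustration of how such a synthesis works (inverse-variance weighting is one standard pooling method; the study figures below are hypothetical):

    # Fixed-effect pooling: weight each study's effect size by the inverse of
    # its variance, so larger, more precise studies count more toward the summary.
    def pooled_effect(effect_sizes, variances):
        weights = [1.0 / v for v in variances]
        return sum(w * e for w, e in zip(weights, effect_sizes)) / sum(weights)

    # Three hypothetical studies: effect sizes with their variances.
    print(pooled_effect([0.20, 0.35, 0.10], [0.04, 0.09, 0.01]))  # ~0.14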

Quality of the Research
In evaluating an evidence-based program, it is important to assess the quality of the research done to measure the program’s effectiveness.

    • Factors to consider include the quality of the researchers, the design of the study, the size of the group studied, and the number of studies with similar results.[25]
    • Some have raised concerns about the vulnerability of reviews of evidence-based research studies to various types of bias including the following: [26]
        • biases that arise in the original study, for example, treatment outcome studies may systematically overestimate or underestimate effects due to design and implementation problems;
        • biases in the reporting, publication, and dissemination of results of studies, such as “confirmation bias” — a tendency to emphasize evidence that supports a hypothesis while ignoring evidence to the contrary; and
        • biases in the review process itself, for example when reviewers sample studies selectively such as by only including published studies.

Cultural Competence
In order to effectively treat youth of diverse racial and ethnic backgrounds, it is important that the evidence-based practice selected be effective with the population of youth being served.

    • Concerns have been voiced regarding the lack of cultural diversity in many evidence-based practice clinical trial populations. Some believe this means evidence-based practices have limited utility for communities of color.[27]
    • On the more promising side, several large studies suggest that many commonly used EBPs have comparable or even better results with some youth of color.[28]
    • Information on programs that have been tested with diverse ethnic groups can be found through some websites, such as the SAMHSA NREPP website, though the lists may not always be current.[29]

Research studies of juvenile justice programs have identified not only programs that work well but also programs and practices that are ineffective and even harmful. The Washington State Institute for Public Policy (WSIPP) conducts cost-benefit analyses of many juvenile justice programs and has identified several in which the costs exceed the benefits; the National Institute of Justice likewise flags ineffective programs on its CrimeSolutions.gov website. Below are some examples:

    • Formal System Processing
      Formal System Processing refers to “[t]he practice of using traditional juvenile justice system processing in lieu of alternative sanctions to deal with juvenile criminal cases.” Based on a systematic review by Petrosino and colleagues (2010) of 27 studies,[30] the National Institute of Justice’s CrimeSolutions.gov website rated formal system processing as having no effect on youth, while generally costing more than alternatives and diversion programs.[31]
    • Scared Straight (Juvenile Awareness Programs)
      Scared Straight programs involve taking at-risk youth and/or those involved in the juvenile system on tours of prisons with the overall goal of deterring youth from future criminal behavior through aggressive presentations by inmates intended to “scare” youth out of committing offenses. Studies by WSIPP and a systematic research review by Petrosino and colleagues both found Scared Straight programs were “more harmful than doing nothing” because they were likely to increase recidivism.[32]
    • Youth Automatically Sentenced as Adults
      In Washington state, some youth under the age of 18 are automatically tried as adults. A WSIPP study found a higher recidivism rate among these automatically transferred youth.[33]

A practice labeled evidence-based has been shown to be effective, and so offers the greatest degree of reliability in achieving the identified outcome when implemented as directed by the program developer. Some distinct benefits include:

    • Improving Public Safety
      Public safety is improved through a variety of mechanisms, such as reducing rates of re-arrest, reducing the number of youth confined to secure facilities, decreasing substance abuse, and improving mental health.[34]
    • Improved Outcomes for Youth and Families
      Evidence-based practice interventions, when properly implemented, achieve superior outcomes for youth and families.[35]
    • Saving Money
      Many evidence-based practices have been proven to be cost-effective. In addition to saving communities money, evidence of cost-effectiveness is appealing to government officials, which can help localities secure the resources and support needed to implement these programs.[36]
    • Technical Assistance
      Many evidence-based practices are promulgated and supported by organizations that provide (for a fee) staff training and in-person technical assistance.[37]
    • Community Support
      The demonstrated effectiveness of evidence-based practices can help to garner more community buy-in for the program, which can increase youth participation and retention in the program.[38]

Systematic research reviews have shown that some general evidence-based principles provide guidance on how to intervene most effectively with justice-involved youth to reduce their likelihood of committing new crimes.[39]

Risk, Needs, and Responsivity Principles: Whom to Target, What to Address, and How
Key concepts in intervening with youth are the risk, needs, and responsivity principles, which are described below:

Risk Principle (Whom to Target)

      • Risk refers to the probability that a youth will reoffend, based on static factors (those unlikely to change) that are correlated with the risk of re-offense, such as age, age at first arrest, and number of prior arrests.[40]
      • The most intensive intervention programs should be reserved for those at the highest risk of reoffending.[41]
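
To illustrate the risk principle in code, here is a toy sketch; the factors, weights, and thresholds are invented for illustration and do not represent a validated risk assessment instrument. Static factors are combined into a score, and the most intensive services are reserved for the highest scores.

    # Toy risk score built from static factors; NOT a validated instrument.
    def risk_score(age_at_first_arrest, prior_arrests):
        score = 2 if age_at_first_arrest < 14 else 0  # earlier onset, higher risk
        score += min(prior_arrests, 5)                # cap the arrest-count weight
        return score

    def service_intensity(score):
        # Reserve the most intensive intervention for the highest-risk youth.
        return "intensive" if score >= 5 else "moderate" if score >= 3 else "low"

    print(service_intensity(risk_score(age_at_first_arrest=13, prior_arrests=4)))
    # -> intensive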

Needs Principle (What to Target)

      • Refers to “criminogenic needs,” also known as dynamic risk factors, which are a youth’s personal and environmental attributes that also are highly correlated with future delinquent offenses, but are more changeable.[42]
      • Examples include anti-social attitudes, negative peer influences, temperament (such as impulsivity), lack of family support, substance abuse, and educational difficulties.[43]
      • Programs should target criminogenic needs and not other needs a youth may have, such as physical conditioning, if they are unrelated to youthful reoffending.[44]
          • Using this approach should help to avoid “net widening,” or keeping youth in the justice system solely to receive treatment for mental health or other issues. It is critical that juvenile justice programs only be used for youth who would otherwise be formally processed within the juvenile justice system. Otherwise, we may be subjecting youth to the negative impacts of formal justice system processing unnecessarily.[45]
          • Note, however, that this approach can run counter to the positive youth justice approach, which places a priority on meeting a youth’s developmental needs, even if they are not directly tied to risk of re-offense.

Responsivity Principle (How to Target)

      • The responsivity or treatment principle is intended to reflect youth’s readiness for change and gauge their ability to respond to particular treatment methods and types of programs.[46]
      • The responsivity principle addresses how treatment should be delivered for the best match of a youth’s individual characteristics, such as culture and cognitive ability, with program characteristics (location, structure, facilitator traits, methodology).[47]
      • Pursuant to this principle, how treatment is delivered is important.
          • Behavioral and cognitive behavioral skill-building techniques have generally been found to be the most effective.[48]
          • Program fidelity, or ensuring that the program is delivered as designed, also matters.[49]

Effective implementation of an evidence-based practice is just as important to a successful outcome as the selection itself, and it can be a complex process. Whether adopting a particular evidence-based treatment program to use with youth or implementing a new agency-wide evidence-based policy, some of the same challenges are involved, as it is generally difficult for agencies to modify the way they operate.[50]

The Mental Health and Juvenile Justice Collaborative for Change (part of the Models for Change Resource Center Partnership) has developed ten steps and four phases to guide the effective implementation of evidence-based practices. While the guide is focused on evidence-based practices for youth with behavioral health needs in the juvenile justice system, this approach can provide the framework for other types of evidence-based practices for juvenile-justice-involved youth as well.

Below is a brief description of steps necessary for effective implementation, based in part on the work of the Collaborative for Change (though it’s important to note that the process may not always proceed in such a linear fashion):[51]

1.   Lay the Foundation for an Evidence-Based Practice Approach[52]

Change is a long-term process that requires strategic and careful planning.[53] This phase involves a number of activities to both educate and build support among stakeholders for moving to an evidence-based practice approach. The following steps are suggested to lay the groundwork.

        • Build a Network
          Assemble informed and educated stakeholders and decision-makers to help build a culture that understands and is supportive of evidence-based practices.[54]

            • This network can take the form of a steering committee composed of consumers, families, advocates, and community members, as well as the practitioners who will be delivering the service.[55]
            • The Collaborative for Change then recommends developing a “collaborative decision-making body” to help set a clear vision of how the evidence-based practice will support juvenile justice-involved youth and to select and implement the evidence-based practice.[56]
        • Strategic Planning
          A great deal of thought and planning is needed to lay a solid foundation for the introduction of an evidence-based practice, such as determining the following:

            • who the target population is, and what its characteristics are – such as age, gender, race, and cultural specifics;
            • what the outcomes are that you want to achieve for this group of youth;
            • what services are currently being delivered to this population; and
            • what the capacity of your staff and organization is to meet the youth’s needs.[57]
        • Collecting this information and conducting a local needs assessment to identify key resources, gaps, and access points in the community are recommended as the starting point for this process.[58]

2.   Select the Evidence-Based Practice

Selection is based on meeting the needs of the target population while considering the readiness of the community and providers. The following steps are recommended.[59]

        • Education about the Evidence-Based Practices Available
          Those selecting the evidence-based practice (the Collaborative for Change recommends that this be done by the collaborative decision-making body) need to learn about the different types of evidence-based practices available, which work best for particular youth risk factors, and what outcomes are likely to be achieved.[60] They should also learn about other characteristics associated with each evidence-based practice, including implementation and operation costs, staff training needed, and cultural applicability.[61]
        • Assess Readiness[62]
          The community, providers, and agency staff must be ready to commit to the initiative. A variety of factors increase the likelihood of successful adoption of the evidence-based practice within an agency, including:[63]

            • organizational effectiveness — whether leadership is firmly committed to change and to securing the necessary funds, and the agency is stable (low staff turnover rate, financially sound);
            • staff support and qualifications — direct service staff are convinced that change is necessary, are committed to long-term change, and hold or are willing to secure the necessary credentials and training; and
            • agreement of staff and stakeholders that the evidence-based practice is the right strategic fit.
        • Make the Choice[64]
          The group selecting the evidence-based practice should synthesize the above information to make a selection that takes into account program match, or how well the evidence-based practice fits the purpose, organization, target population, and community; quality of the program; and organizational resources – including administrative, financial, and human resources.[65]

3.   Develop a Plan for Implementation and Sustainability[66]

It is important not to rush the actual implementation phase of the evidence-based practice. Consideration should be given to a number of factors prior to implementation, including:

        • an assessment of all the start-up costs (such as licensing fees, training and supervision costs);
        • ensuring the staff is properly credentialed and trained;
        • setting clear policies and procedures for operating the program – including how to protect youth confidentiality;
        • developing a quality assurance process to determine if the evidence-based practice is resulting in the expected and desired outcomes, including determining what outcomes to measure and what data to collect for this measurement;
        • determining how to ensure fidelity to the practice model; and
        • developing a plan to sustain the evidence-based practice, which will require continued leadership and support, and a source of continued funding, in addition to positive outcomes.

As detailed above, effective implementation of EBPs can be a time-consuming process. Some recommend using or developing resource centers to provide technical assistance to jurisdictions on a variety of the aspects detailed above, including needs assessment, program selection, and training in implementing a program.[67]

Implementing evidence-based practices requires juvenile justice organizations to modify the way they have been operating, and this type of change is never easy: nearly 70 percent of all innovation and implementation initiatives in the public and private sectors fail.[68] Some of the challenges organizations face in implementing evidence-based practices include:

    • Complexity of the Coordination and Implementation Process
      Implementing an evidence-based practice is a long-term process that requires strategic planning to succeed. The process often takes two years or more and requires the active involvement of many stakeholders at the state and local levels.[69]
    • Leadership and Staff Support
      Successful implementation requires strong leadership and key personnel to champion and direct the program. Such leaders can help to galvanize strong staff buy-in and support for the program, which are also needed.[70]
    • Staff Training
      Staff need adequate training and technical assistance to gain comfort and skill in delivering the service with fidelity to the program; staff support of a program has been found to be tied to the quality of its training and technical assistance. Adequate financial compensation has also been found to be important to staff motivation.[71]
    • Funding
      While many evidence-based practices have been proven to be cost-effective once in place, initial financial investment is needed to adopt and implement the programs. Many evidence-based practices have substantial start-up costs to purchase the curricula, program materials, and trainings.[72] Current funding mechanisms may also not be structured to encourage or support evidence-based practices.[73]
    • Target Population Match
      Programs are geared to particular populations of youth, and their effectiveness degrades if they are not used with the intended population.[74]
        • Challenges can arise in making this match for a number of reasons, including lack of knowledge about the evidence-based practices available and their “fit” with the local population,[75] and lack of appropriate tools, such as risk assessment instruments, to successfully make this match.[76]
        • It is also important to guard against net widening – programs geared to a particular population of youth, such as youth who commit serious offenses, may be working so well that it is tempting to use them for other youth populations — such as youth who skip school — for whom the intervention may not be useful, or may even be actively harmful. Working with policymakers to establish criteria for inclusion in the program can help to reduce this problem.[77]
    • Mismatch between Intervention and Policy or Legislation
      Administrators or legislative mandates may require wide or rapid implementation of a new treatment, a pace that may not work for more complex programs, which require lengthier time periods to adequately train staff and implement effectively.[78]
    • Matching Program Requirements with Organizational Limitations
      Fidelity to evidence-based program requirements — such as the number and length of treatment sessions — is integral to effectiveness. These requirements can bump up against organizational constraints, such as large caseloads, lack of supervision, and resource limits on the types, frequency, and duration of services.[79]
    • Outcome Monitoring
      Providers may not be used to outcome monitoring and may resist it, but it is a necessary component of an effective program, enabling problems to be discovered and solutions developed.[80]

While the numbers are growing, most states have so far made only limited progress in using evidence-based programs and practices.[81]

    • In a 2011 report, Scott Henggeler (the founder of Multisystemic Therapy [MST]) estimated that fewer than 5 percent of eligible youth in the United States at high risk of reoffending are treated with evidence-based treatment programs each year.[82]
    • Despite this slow progress, a handful of states are aggressively implementing community-based, evidence-based treatment programs and realizing significant benefits.[83]
    • A 2012 report examined the number of Functional Family Therapy (FFT), Multisystemic Therapy (MST), and Multidimensional Treatment Foster Care (MTFC) “therapist teams” available on a per capita basis and found that New Mexico, Louisiana, Maine, Hawaii, and Connecticut had the highest number of teams, with the availability of the programs averaging more than ten per million individuals in the population.[84]
    • An updated analysis from the Sentencing Project entitled "State Action to Narrow the School to Prison Pipeline" shows that only a small number of states are reducing over-policing and zero tolerance in public schools, despite a massive allocation of COVID-19-related federal relief funds. According to Part One of the report, few plans addressed concerns with current practices around truancy, and not a single state proposed to use ESSER funds to end or curtail the criminalization of routine adolescent behavior at school. Part Two provides specific examples of promising approaches to closing the school-to-prison pipeline described by states in their ARP ESSER plans.

Go here for information on states that have adopted policies and passed legislation to further the use of evidence-based practices in juvenile justice.

It is important to remember that there are limits to the utility of evidence-based practices in achieving better outcomes for youth and communities with respect to the juvenile justice system.

First, while researchers have made substantial progress in identifying a number of programs and practices that are effective in changing youth behavior, the research is far from complete — there is much still to be learned about how to make existing programs more effective, and there are new approaches, not yet identified or not yet thoroughly evaluated, that may also prove effective or more cost-effective.

Second, although evidence-based research should play a fundamental role in juvenile justice programming, it is important to recognize that there will never be enough funding or time to evaluate every possible practice. Rigorous evaluation often requires substantial resources. Additionally, there are limitations to the predictive power of any research, as the complexity of human behavior means that no study could provide a 100 percent guarantee of effectiveness.[85]

A Developmental Perspective
While research is a tool to inform practice, and evidence-based programs are an excellent way to improve outcomes for youth, successful reform will be elusive without a concomitant rethinking of the values and principles on which the juvenile justice system rests. In Reforming Juvenile Justice, the National Research Council recommends that “the core principles guiding the way that both less serious and more serious juvenile offenders are treated should flow from a developmental perspective.”[86]

    • The National Research Council recommends interventions that provide opportunities for an adolescent to develop successfully in a supportive social world.[87]
    • This includes model programs, such as those discussed above, as well as considering what might be impeding developmental processes. Over-reliance on institutionalization is an example of a practice that could impede positive adolescent development.[88]
    • Also recommended is more focus on mechanisms to promote positive development.[89]

“Positive Youth Justice” (PYJ)
The “positive youth justice” approach uses decades of theoretical and empirical research on adolescent development to rethink how juvenile justice services are conceived and delivered. While not yet an evidence-based policy, the framework focuses on meeting key developmental needs of youth in the justice system to achieve superior outcomes.

The model, currently being tested and refined in several communities, is strength-based rather than deficit-oriented. It builds on recent studies of adolescent development finding that most youth are resilient, meaning they are able to thrive and develop even in the face of multiple risk factors. The PYJ model draws on this youthful resilience by supporting positive youth development: encouraging youth to develop useful skills and competencies and to build stronger connections with pro-social peers, families, and communities.[90]

The model requires policymakers, program developers, and practitioners to focus on two core assets:[91]

1.   Learning/Doing

        • Developing new skills and competencies
        • Actively using new skills
        • Taking on new roles and responsibilities
        • Developing self-efficacy and personal confidence

2.   Attaching/Belonging

        • Becoming an active member of pro-social groups
        • Developing and enjoying a sense of belonging
        • Placing a high value on service to others
        • Being part of a larger community


Notes

[1] See "What's the evidence for evidence-based practices?" Jeffrey A. Butts, accessed August 24, 2018; “Evidence-Based or Evidence-Informed,” Evidence Generation, accessed August 24, 2018; National Center for Mental Health and Juvenile Justice Collaborative for Change, “Evidence-Based Practices,” accessed September 24, 2018.

[2] When researchers review a generic type of program (e.g., “family therapy”) or practice, they are necessarily looking at a broad range of versions of that program, which means that there will be variability in their effectiveness; the challenge is to identify common elements that contribute to and maximize effectiveness. Mark W. Lipsey et al., “Improving the Effectiveness of Juvenile Justice Programs,” (Center for Juvenile Justice Reform, December 2010), 19-20.

[3] Henninger et al., “Evidence-Based,” accessed April 7, 2014.

[4] “Evidence-Based/Informed,” Evidence Generation, John Jay College of Criminal Justice, accessed June 24, 2014.

[5] John A. Morris, Stephen Day, and Sonja K. Schoenwald, "Turning Knowledge into Practice," 2nd ed. (revised), (Boston, MA: The Technical Assistance Collaborative, Inc., Winter 2010), 15-16.

[6] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices - Overview: EBPs Defined,” accessed April 4, 2014, and Models for Change, “Evidence-Based Programs for Juvenile Justice Reform in Louisiana” (Chicago: The John D. and Catherine T. MacArthur Foundation, February 2010).

[7] Siobhan M. Cooney, Mary Huser, Stephen Small, and Cailin O’Connor, “Evidence-Based Programs: An Overview,” What Works, Wisconsin – Research to Practice Series, 6 (Madison, WI: University of Wisconsin-Madison and University of Wisconsin-Extension, 2007), 2.

[8] Cooney et al., “Evidence-Based Programs: An Overview,” 2.

[9] Henninger et al., “Evidence-Based,” accessed April 7, 2014; Washington State Institute for Public Policy, “Updated Inventory of Evidence-based, Research-based, and Promising Practices” (January 2014): 3; “Program Review and Rating from Start to Finish,” CrimeSolutions.gov, National Institute of Justice, accessed June 27, 2014.

[10] Henninger et al., “Evidence-Based,” accessed April 7, 2014; Washington State Institute for Public Policy, “Updated Inventory of Evidence-based, Research-based, and Promising Practices” (January 2014): 3.

[11] James Bell, “Non-Judicial Drivers into the Juvenile Justice System for Youth of Color,” (W. Haywood Burns Institute, undated): 9.

[12] Akiva M. Lieberman, “Advocating Evidence-Generating Policies: A Role for the ASC,” The Criminologist, Vol. 34, #1 (January/February 2009).

[13] See, for example, National Research Council, Reforming Juvenile Justice: A Developmental Approach (Washington, DC: The National Academies Press, 2013), 139-144; Mark W. Lipsey, James C. Howell, Marion R. Kelly, Gabrielle Chapman, and Darin Carver, “Improving the Effectiveness of Juvenile Justice Programs,” (Washington, D.C.: Center for Juvenile Justice Reform, December 2010), 22-3.

[14] Phillip W. Harris, Brian Lockwood, Liz Mengers, and Bartlett H. Stoodley, “Measuring Recidivism in Juvenile Corrections,” Journal of Juvenile Justice, Vol. 1, Issue 1 (Washington, D.C.: Office of Juvenile Justice and Delinquency Prevention, Fall 2011).

[15] William H. Barton, G. Roger Jarjoura, and Andre B. Rosay, “Applying a Developmental Lens to Juvenile Reentry and Reintegration,” Journal of Juvenile Justice, Vol. 1, Issue 2 (Washington, DC: Office of Juvenile Justice and Delinquency Prevention, Spring 2012).

[16] American Prosecutors Research Institute, “Juvenile Prosecution Policy Positions and Guidelines,” accessed September 8, 2018.

[17] The Iowa Practice Improvement Collaborative Project, “Evidence-Based Practices: An Implementation Guide for Community-Based Substance Abuse Treatment Agencies” (Iowa City, IA: The Iowa Consortium for Substance Abuse Research and Evaluation, Spring 2003): 26-27.

[18] Operating costs per youth for a year’s stay in a juvenile justice facility can exceed $100,000. National Juvenile Justice Network and Texas Public Policy Foundation, “The Comeback States,” (2013): 6-7.

[19] National Research Council, Reforming Juvenile Justice: A Developmental Approach, 168.

[20] “Inventory of Evidence-Based, Research-Based, and Promising Practices for Prevention and Intervention Services for Children and Juveniles in Child Welfare, Juvenile Justice, and Mental Health Systems” (Olympia, WA: Washington State Institute for Public Policy, June 2013); Steve Aos, Marna Miller, and Elizabeth Drake, “Evidence-Based Public Policy Options to Reduce Future Prison Construction, Criminal Justice Costs, and Crime Rates” (Olympia, WA: Washington State Institute for Public Policy, 2006).

[21] Jeffrey A. Butts, “Effect Size,” Evidence Generation, accessed September 24, 2018.

[22] Effect size is generally calculated as the [mean of the experimental group] – [mean of the control group] divided by the standard deviation. Robert Coe, “It’s the Effect Size, Stupid” (paper presented at the Annual Conference of the British Educational Research Association, University of Exeter, England, September 2002); Jeffrey A. Butts, “Effect Size,” Evidence Generation, accessed April 7, 2014; Jeffrey A. Butts, “What’s the Evidence for Evidence-Based Practice?” Research and Evaluation Data Bits (New York, NY: Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York, 2012).

[23] “The Meta Analysis of Research Studies,” ERIC Clearinghouse on Assessment and Evaluation, Department of Measurement, Statistics, and Evaluation, University of Maryland, College Park, accessed April 8, 2014.

[24] Lipsey et al., 21-22.

[25] John A. Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 38-9.

[26] This information is from a report by Julia H. Littell in which she explored the sources and types of bias to which published reviews are vulnerable by examining 37 published reviews of MST and found that many reviews were influenced by a number of biases. Julia H. Littell, “Evidence-Based or Biased? The Quality of Published Reviews of Evidence-Based Practices,” Children and Youth Services Review 30 (Elsevier Ltd. 2008): 1313, 1299-1317.

[27] Sarah Cusworth Walker, Eric Trupin, and Jacquelyn Hansen, “A Toolkit for Applying the Cultural Enhancement Model to Evidence-Based Practice,” (Division of Public Behavioral Health & Justice Policy, University of Washington and The John D. and Catherine T. MacArthur Foundation Models for Change initiative, Nov. 22, 2013), 5; W. Haywood Burns Institute, “Non-Judicial Drivers into the Juvenile Justice System for Youth of Color,” (2011): 9.

[28] John A. Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 28; Jeanne Miranda, Guillermo Bernal, Anna Lau, Laura Kohn, Wei-Chin Hwang, and Teresa LaFromboise, “State of the Science on Psychosocial Interventions for Ethnic Minorities,” Annual Review of Clinical Psychology 1 (2005): 113–142; Stanley J. Huey, Jr., and Antonio J. Polo, “Evidence-Based Psychosocial Treatments for Ethnic Minority Youth,” Journal of Clinical Child and Adolescent Psychology 37(1) (Jan. 2008): 262-301.

[29] John A. Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 28.

[30] Anthony Petrosino, Carolyn Turpin-Petrosino, and Sarah Guckenburg, “Formal System Processing of Juveniles: Effects on Delinquency,” Campbell Systematic Reviews (2010): 1.

[31] National Institute of Justice, CrimeSolutions.gov, “Practice Profile: Formal System Processing for Juveniles,” accessed July 29, 2014.

[32] National Institute of Justice, CrimeSolutions.gov, “Practice Profile: Juvenile Awareness Programs (Scared Straight),” accessed July 31, 2014; Steve Aos, Polly Phipps, Robert Barnoski, and Roxanne Lieb, “The Comparative Costs and Benefits of Programs to Reduce Crime” (Olympia, WA: Washington State Institute for Public Policy, 2001); Anthony Petrosino, John Buehler, and Carolyn Turpin-Petrosino, “‘Scared Straight’ and Other Juvenile Awareness Programs for Preventing Juvenile Delinquency: a Systematic Review,” Campbell Systematic Reviews (2013), 7.

[33] Elizabeth Drake, “The Effectiveness of Declining Juvenile Court Jurisdiction of Youthful Offenders” (Olympia, WA: Washington State Institute for Public Policy, December 2013).

[34] Stephen Phillippi, Laquinta Below, and Damien Cuffie, “Evidence-Based Practices for Juvenile Justice Reform in Louisiana” (New Orleans, LA: Louisiana State University School of Public Health & Louisiana Models for Change in Juvenile Justice, January 2010), v.

[35] EBPs by definition have a considerable amount of research evidence to indicate their use improves outcomes. See the Resources section, “Using Evidence-Based Treatment Programs,” for further information and registries of many evidence-based programs.

[36] Cooney et al., “Evidence-Based Programs: An Overview,” 3-4.

[37] Cooney et al., “Evidence-Based Programs: An Overview,” 3.

[38] Cooney et al., “Evidence-Based Programs: An Overview,” 3-4.

[39] See Pennsylvania Commission on Crime and Delinquency, “Pennsylvania’s Juvenile Justice System Enhancement Strategy,” (April 2012): 8; see also Lipsey et al., 12.

[40] Pennsylvania Commission on Crime and Delinquency, 8; Randy Borum, “Managing At-Risk Juvenile Offenders in the Community: Putting Evidence-Based Principles Into Practice,” Journal of Contemporary Criminal Justice, Vol. 19, No. 1 (February 2003): 116-117.

[41] Edward J. Latessa and Christopher Lowenkamp, “What Works in Reducing Recidivism?” University of St. Thomas Law Journal, Vol. 3:3 (2006): 522.

[42] Pennsylvania Commission on Crime and Delinquency, 116-117.

[43] Pennsylvania Commission on Crime and Delinquency, 116-117.

[44] Latessa & Lowenkamp, “What Works in Reducing Recidivism?” 523.

[45] Models for Change, “Guide to Developing Pre-Adjudication Diversion Policy and Practice in Pennsylvania,” (Chicago, IL: John D. and Catherine T. MacArthur Foundation, September 2010), 8.

[46] Chris Baird, Theresa Healy, Kristen Johnson, Andrea Bogie, Erin Wicke Dankert, and Chris Scharenbroch, “A Comparison of Risk Assessment Instruments in Juvenile Justice” (National Council on Crime and Delinquency, August 2013): i; Pennsylvania Commission on Crime and Delinquency, 8.

[47] Pennsylvania Commission on Crime and Delinquency, 8.

[48] Pennsylvania Commission on Crime and Delinquency, 8; Latessa & Lowenkamp, “What Works in Reducing Recidivism?” 523-5.

[49] Pennsylvania Commission on Crime and Delinquency, 8; Latessa & Lowenkamp, “What Works in Reducing Recidivism?” 523-5.

[50] “Resistance to the adoption of evidence-based practice and systems of care is well recognized in the literature on program implementation,” Mark W. Lipsey et al., "Improving the Effectiveness of Juvenile Justice Programs," 48.

[51] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices - Guidance from the Field,” accessed April 16, 2018.

[52] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Phase 1: Laying the Groundwork,” accessed April 16, 2018.

[53] Pennsylvania Commission on Crime and Delinquency, 11.

[54] National Center for Mental Health and Juvenile Justice and Louisiana State University Health Sciences Center, “Louisiana Models for Change: Fostering a Movement Towards Evidence-Based Screening, Assessment, and Treatment” (undated): 3.

[55] Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 61.

[56] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Three: Establish a Collaborative Decision-Making Body,” accessed April 16, 2018; Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Four: Set the Vision,” accessed April 16, 2018.

[57] Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 59; Models for Change, “Evidence-Based Programs for Juvenile Justice Reform in Louisiana” (Chicago: The John D. and Catherine T. MacArthur Foundation, February 2010), 15.

[58] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Two: Conduct a Needs Assessment,” accessed April 16, 2018; National Center for Mental Health and Juvenile Justice and Louisiana State University Health Sciences Center, “Louisiana Models for Change: Fostering a Movement Towards Evidence-Based Screening, Assessment, and Treatment” (undated): 3.

[59] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Phase II: Choosing Evidence-Based Practices,” accessed April 16, 2018.

[60] Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 61.

[61] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Five: Review EBPs.”

[62] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Six: Assess Readiness,” accessed April 16, 2018.

[63] Stephen Phillippi, Jr., Laquinta Below, and Damien Cuffie, “Evidence-Based Practices for Juvenile Justice Reform in Louisiana” (New Orleans, LA: Louisiana State University School of Public Health & Louisiana Models for Change in Juvenile Justice, February 2010), 16-17; Pennsylvania Commission on Crime and Delinquency, 12.

[64] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Seven: Select an Appropriate EBP,” accessed April 16, 2018.

[65] Morris et al., "Turning Knowledge into Practice," 2nd ed. (revised), 62.

[66] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Eight: Implement the Selected EBP,” accessed April 16, 2018; Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step Nine: Institute a Quality Assurance Process,” accessed April 16, 2018; Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Phase IV: Sustaining Evidence-Based Practices,” accessed April 16, 2018.

[67] Evidence-Based Practice Resource Center, Substance Abuse and Mental Health Services Administration.

[68] Pennsylvania Commission on Crime and Delinquency, “Pennsylvania’s Juvenile Justice System Enhancement Strategy” (April 2012): 11-12.

[69] Peter W. Greenwood, Brandon C. Welsh, and Michael Rocque, “Implementing Proven Programs for Juvenile Offenders” (Association for the Advancement of Evidence-Based Practice, December 2012): 10; Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step One: Form a Steering Committee,” accessed April 16, 2018.

[70] Sharon Mihalic et al., “Blueprints for Violence Prevention” (Boulder, CO: Center for the Study and Prevention of Violence, University of Colorado, July 2004): 103-4, 107.

[71] Mihalic et al., “Blueprints for Violence Prevention,” 104-5.

[72] Cooney et al., “Evidence-Based Programs: An Overview,” 4.

[73] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step One: Form a Steering Committee – Challenges Associated with EBPs.”

[74] Scott W. Henggeler and Sonja J. Schoenwald, “Evidence-Based Interventions for Juvenile Offenders and Juvenile Justice Policies that Support Them,” Social Policy Report, Vol. 25, No. 1 (2011): 11.

[75] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step One: Form a Steering Committee.”

[76] Lipsey, “Improving the Effectiveness of Juvenile Justice Programs,” 40.

[77] Henggeler and Schoenwald, 11.

[78] Henggeler and Schoenwald, 10-11.

[79] Lipsey, “Improving the Effectiveness of Juvenile Justice Programs,” 36.

[80] Mental Health and Juvenile Justice Collaborative for Change, “Implementing Evidence-Based Practices – Step One: Form a Steering Committee;” Sharon Mihalic et al., “Blueprints for Violence Prevention,” 114-15.

[81] Dick Mendel, “Only a Few States Enact Evidence-Based Care for Troubled Youth,” The Center for Public Integrity, Juvenile Justice Information Exchange, July 3, 2013.

[82] Scott W. Henggeler and Sonja J. Schoenwald, “Evidence-Based Interventions for Juvenile Offenders and Juvenile Justice Policies that Support Them,” Social Policy Report, Vol. 25, No. 1 (2011): 8.

[83] Dick Mendel, “Only a Few States Enact Evidence-Based Care for Troubled Youth,” The Center for Public Integrity, Juvenile Justice Information Exchange, July 3, 2013; Peter W. Greenwood, Brandon C. Welsh, and Michael Rocque, “Implementing Proven Programs for Juvenile Offenders” (Association for the Advancement of Evidence-Based Practice, December 2012), accessed October 22, 2018.

[84] Greenwood, Welsh, and Rocque, 16-17.

[85] Jeffrey A. Butts, “What’s the Evidence for Evidence-Based Practice?” Research and Evaluation Data Bits (John Jay College of Criminal Justice Research and Evaluation Center, New York, NY: Oct. 5, 2012).

[86] National Research Council, "Reforming Juvenile Justice: A Developmental Approach," (Washington, DC: The National Academies Press, 2013): 177.

[87] "Reforming Juvenile Justice: A Developmental Approach,"179.

[88] "Reforming Juvenile Justice: A Developmental Approach," 179.

[89] "Reforming Juvenile Justice: A Developmental Approach," 180.

[90] Jeffrey A. Butts, Gordon Bazemore, and Aundra Saa Meroe, “Positive Youth Justice: Framing Justice Interventions Using the Concepts of Positive Youth Development,” (Washington, DC: Coalition for Juvenile Justice, 2010): 9.

[91] Butts et al., “Positive Youth Justice,” 16.