It seemed a throwback to the days of the country doctor: Go to the patients instead of having them come to you.
As a young intern in the pediatrics department at the University of Virginia’s medical school in the mid-1970s, Scott Henggeler got that advice from his supervisor, a social worker on staff.
He heeded it, taking the department’s van on house calls through the scenic countryside of the Charlottesville area and the nearby Shenandoah Valley, and soon had an epiphany about the folly of trying to treat some of the most troubled youngsters in an office setting.
“I visited probably about six, seven homes, and in each case, all it really took was to just set foot inside the door and you realized how goofy your academic treatment plan was,” Henggeler told JJIE. “Doing the home-based stuff just removed the barriers, really removed most of the barriers and helped you better engage with the families, but also very importantly, you got much more accurate assessment data.
“When you understand better, and there’s really nothing better than sitting in someone’s living room for this, when you understand the real-life context of folks – who’s living at the house, what people are like, what their life is like – it helps you develop better and more accurate treatment plans,” he said. “And it also gives you better outcome data because you’re there seeing if the change has actually happened vs. bringing them into a clinic to report what’s happening.”
Henggeler and his colleagues ultimately expanded treatment not only to include juveniles’ homes and families but also their schools, teachers, neighborhoods and peers.
By the early 1990s, the clinicians had named their pioneering approach “Multisystemic Therapy” (MST) for its focus on the “systems” in a child’s life, and after numerous trials, MST expanded rapidly. Today, MST – which is affiliated with the for-profit company MST Services, based at the Family Services Research Center at the Medical University of South Carolina in Charleston, S.C. – operates in 34 states, the District of Columbia and 14 other countries.
Multisystemic Therapy stands out as one of the best-known and most thoroughly researched among what’s referred to in juvenile justice circles as “evidence-based practices” (EBPs) – programs and practices that have been shown through research to prevent or reduce juvenile crime and help youths make better choices.
But the EBP designation is both coveted and controversial.
Purists maintain only programs subjected to the highest scrutiny should be employed in juvenile justice, viewing them as akin to the gold standard in medicine.
Skeptics acknowledge programs designated EBPs can be very effective, but say limiting funding to such programs cuts off too many promising homegrown and grassroots programs that aren’t big brand names in the field.
In juvenile justice, the term evidence-based practices can be traced back at least to the mid-1990s, when randomized, controlled trials – modeled after those used by the U.S. Food and Drug Administration when reviewing applications for new drugs – tested the effectiveness of particular juvenile justice programs.
Being deemed an EBP is no small matter. In requests for proposals, numerous states, counties, the federal government and private foundations specify that a reform effort must qualify as an EBP to receive funding.
And getting the EBP designation comes with no small price tag. Some experts peg the cost of a randomized, controlled study of a juvenile justice program at anywhere from $300,000 to $5 million.
Seven national registries, three of them federal, bestow the EBP designation, using different labels for it. But the registries vary widely in the quantity and rigor of supporting evidence they require. All told, at least 450 programs addressing youth behaviors including juvenile crime have been given the stamp of approval as being evidence-based by at least one of the registries.
The Gold in Blueprints
By all accounts, the registry with the most stringent review standards for juvenile justice programs (and now, other programs serving youths) is maintained by Blueprints for Healthy Youth Development, based at the Institute of Behavioral Science at the University of Colorado, Boulder. The Baltimore-based Annie E. Casey Foundation funds Blueprints.
The Colorado institute’s Center for the Study and Prevention of Violence started Blueprints in 1996 (as Blueprints for Violence Prevention) to help identify and replicate violence, delinquency and drug prevention programs that have been proven effective.
Delbert Elliott, a University of Colorado sociology professor emeritus and the founding director of the Center for the Study and Prevention of Violence, said Blueprints grew out of frustration with how Colorado had been making decisions on funding juvenile justice programs in the 1990s.
“They were made on who you know, your political contributions, because the committee that was making these selections were all politicians,” Elliott told JJIE. “What was surprising to me was that nobody was asking the question about what we know about whether this works. Is there some research evidence that this program has proven to be effective, that we know if you put kids in this program you’re going to get some positive effects?”
Each of the 1,300 programs evaluated by Blueprints has been reviewed by an independent panel of evaluation experts; only nine juvenile justice-related programs have received the “model” program rating, while 23 have been rated “promising.” To qualify as a model program, a program must have been evaluated in at least one randomized, controlled trial, its positive results must be sustained for at least a year after the intervention ends, and it must be capable of being replicated elsewhere. (Elliott says programs should be rated model to be implemented statewide or nationally but that the promising rating suffices for use at the local level.)
“You have to be really careful when we’re investing public monies in these kind of programs; you have to be pretty sure that you’re going to get positive effects,” Elliott said.
He noted some juvenile justice programs, like Scared Straight, have been proven not only ineffective, but harmful.
“We don’t have a Hippocratic Oath that we require for the developers of these programs,” Elliott said. “So the whole evidence-based movement is a way of saying we should be investing in … programs that work.”
Clay Yeager, a senior consultant for the Washington-based Evidence-Based Associates, which helps jurisdictions implement and manage EBPs, applauded the Blueprints registry.
Yeager, who is based in York, Pa., and served as the director of Pennsylvania’s Office of Juvenile Justice and Delinquency Prevention from 1997 to 2002, said he advocates randomized, controlled trials as the gold standard for juvenile justice programs.
“If a standard is good enough for us to utilize when prescribing medication, why should we have a lower standard for programs addressing problems in kids and families and communities?” he said.
Not surprisingly, Yeager said he’s wary of the half-dozen registries besides Blueprints. Their EBP designations, he said, have been diluted to the point that they have “lost their meaning and value” in assessing the potential effectiveness of programs.
“It’s created a great deal of confusion among the consumers out there – and by that I mean public policy officials, practitioners, not being able to discern clearly the differences among the standards that qualify each of these 450-plus programs as meeting the test for being evidence-based,” Yeager told JJIE.
“There have been so many iterations and variations on this singular theme of being evidence-based, and it’s minimized the value of what that term was originally intended to mean,” Yeager said. “It’s a disservice to the kids, to the families, to the taxpayers ... to not have more clarity about what constitutes evidence-based programs, what constitutes the definition of the evidence-based.”
Evidence-Based Associates has helped jurisdictions implement and manage MST and other Blueprints model programs, including Functional Family Therapy, which involves intensive work with troubled youths and their families.
Yeager noted the U.S. spends $6 billion to $7 billion a year on locking kids up, an average of $88,000 per youth per year, despite a dearth of evidence it’s effective.
By contrast, MST costs about $5,100 to $11,950 per family in the U.S. Over an average of four months of treatment, therapists make multiple contacts in the family’s home and surrounding community.
MST points to research showing the treatment reduced long-term re-arrest rates by 25 percent to 70 percent and cut out-of-home placements by 47 percent to 64 percent. The therapy also led to decreased substance abuse, fewer mental health problems for serious juvenile offenders and much better family functioning, according to MST.
FFT Traces Roots to Home Visits in Watts
FFT costs an average of about $3,500 per family, and each family is typically seen in both office and home settings 12 to 15 times over a three- to five-month period.
FFT founder James Alexander, a psychologist and adjunct research professor of clinical psychology at the University of Utah in Salt Lake City, traces his interest in helping troubled youths and their families to his days of doing home visits as a social worker in the Watts section of Los Angeles just before the 1965 riots.
“I immediately was thrust into issues of race, culture and, of course, a lot of violence, family disruption, all of those kinds of things, so that’s when I started getting the bug to want to do something about it,” Alexander told JJIE.
“We’re trying to help people become adaptive and effective and functional in ways that work for them.”
FFT, based in Seattle, now operates in 45 U.S. states and 10 other countries, and has been shown to significantly reduce recidivism and to help youths overcome delinquency, substance abuse and violence.
While EBPs have received considerable support, not everyone views the gold-standard designation as proof that a program is likely to be effective.
Worthy Programs Left Out?
Jeffrey Butts, the director of the Research and Evaluation Center at John Jay College of Criminal Justice at the City University of New York, said though being evidence-based does not guarantee good results, many act as if it does.
“In a perfect world, where everyone was smart and read everything they could and paid attention and were students of the field and took their job seriously, it wouldn’t be a problem because people would realize that there are multiple sources of information,” Butts said in an interview.
“But in a dumb, clumsy and attention-deficit world, it’s a problem because someone says, ‘Hey, I’m on the Appropriations Committee in Indiana or Nebraska or something and we need a community-based program. Which one should we go with?’ And some 22-year-old on the staff of the committee says, ‘I’ll go to this [EBP registry] website and look it up and find you a good program.’
“In that world, it’s a problem because that 22-year-old staffer talking to the insurance salesman who just got elected to the state legislature is not very sophisticated, and they do go for the easy, seemingly simple answer of these programs registries.”
In an e-mail to JJIE, Butts acknowledged brand-name programs like MST and FFT have merit in treating youths they’re designed to treat, but said, “We need a full menu of options, and we need to continue to develop new options by evaluating new models and new intervention concepts.”
Shaena Fazal, the national policy director for the nonprofit, Washington-based Youth Advocate Programs Inc. (YAP), noted that YAP is not deemed an EBP by some registries but said it has a strong evidence base: Its effectiveness has been documented in 10 external studies and recognized by the Casey Foundation, the federal Office of Juvenile Justice and Delinquency Prevention, the National Council on Crime & Delinquency and John Jay’s Research & Evaluation Center.
YAP provides community-based alternatives to out-of-home placements in 17 states, with interventions including intensive support for youths and their families in their homes, communities and schools.
“We are like probably every other nonprofit not currently in that gold standard of having a randomized, control-group study,” Fazal told JJIE. “In some of our programs, we use specific evidence-based practices even though the YAP model per se has not achieved the gold standard of evidence-based practice with a randomized control group.”
As Fazal wryly noted, no such gold standard exists for the practice of locking up about 160,000 youths a year.
“When people talk about juvenile justice reform and evidence-based practices, we’re very reliant right now on incarceration, which is not at all an evidence-based practice,” she said. “And nobody ever asks for a randomized, control-group study of kids incarcerated, and if they did, they would find that that’s a pretty big barrier.
“That’s one point I like to make because I think it’s worth thinking about. Why is it that we set a standard so high in the community, but we don’t for the harms and dangers of jail?”
YAP, which began in Pennsylvania in 1975, taps into paid advocates “from GEDs to Ph.D.s” who work with youths and their families in the community, Fazal said.
But how can the potential of such generic, community-based programs be evaluated without conducting costly randomized, controlled studies?
A Template to Bridge the Gap?
Mark Lipsey, a research professor at Vanderbilt University in Nashville, Tenn., has devised an experimental template for doing so.
Lipsey’s “Standardized Program Evaluation Protocol” (SPEP) is based on a meta-analysis he and his colleagues conducted of more than 500 studies from the past 20 years.
SPEP scrutinizes recidivism reductions resulting from brand-name programs such as MST, FFT and Aggression Replacement Training as well as generic programs under umbrellas such as group therapy, family therapy and cognitive-behavioral therapy.
“If we start asking every program to have a randomized study to show that it works, every real-world program out there, there’s no hope that will ever happen,” Lipsey told JJIE.
He pointed to the divide that sometimes emerges between EBP purists and those who advocate generic programs.
“In my mind, these are not oppositional. These are complementary,” Lipsey said. “There are some people who are really tied closely to the brand-name concept who find it oppositional to think that anything else might be legitimately called evidence-based. I don’t see it that way. I see different ways of using the evidence.”
Lipsey’s SPEP is a key component of the Juvenile Justice System Improvement Project at the Center for Juvenile Justice Reform at Georgetown University's McCourt School of Public Policy in Washington.
Shay Bilchik, the founder and director of the Center for Juvenile Justice Reform, praised SPEP, saying: “I think the thing that Dr. Lipsey’s work gives us is the fact that we don’t always need to replace our current program with a Blueprints program … in order to show that in essence you’re delivering what the science tells us is effective practice.
“We can use the Standardized Program Evaluation Protocol that Dr. Lipsey’s developed to take a look at those homegrown programs and see how close they come to the evidence-based, Blueprints-type programs,” added Bilchik, who served as administrator of the federal Office of Juvenile Justice and Delinquency Prevention from 1994 to 2000.
SPEP, Bilchik said, can also help identify programs that may not be performing as well as they could be because of poor implementation.
Echoing the sentiments of other observers, Bilchik said SPEP can reveal programs that fall short of their potential because of a mismatch between the programs and the populations being served.
And whether they’re EBPs or not, programs work best when they target high-risk youths, when they’re tailored to the specific needs of youths (based on results of validated screening tools), and when they focus on treatment and skill-building rather than punishment and deterrence.
James Bell, executive director of the Oakland-based W. Haywood Burns Institute, which strives to eliminate racial and ethnic disparities in the juvenile justice system, said requiring programs to be EBPs can lead to such disparities.
“So many community-based organizations that had been serving young people – and this is certainly true for young people of color – that weren’t evidence-based practices felt tremendously threatened by this new wave of it has to be evidence-based, and the only way you can be evidence-based … is to have a randomized, controlled study,” said Bell, who also founded the institute, named after the late civil rights leader W. Haywood Burns.
Bell said he also has a moral objection to using control groups of children who receive only placebo treatment and thus don’t benefit from the program being tested.
He said it’s time to seek common ground between brand-name EBPs and generic, community-based programs.
“I believe the way forward is to not keep pitting the developers of evidence-based programs against the community people, and we should quit trying to make the community people actually reach the level of evidence-based practices,” Bell said.
“To me, those are formulas for failure. What we should do is everybody should want the best practices for young people.”
Financial supporters of JJIE may be quoted or mentioned in our stories. They may also be the subjects of our stories.