Parents don’t know what they don’t know, and nobody is rushing to tell them. School crimes are widely underreported, and the resulting statistics are unreliable.
Student Victimization in U.S. Schools: Results From the 2007 School Crime Supplement to the National Crime Victimization Survey was released this week. It didn’t take long for tweets to pop up on Twitter announcing the report’s statistics on students who were victims of bullying.
The report highlighted “findings” so obvious that one has to ask why the federal government would even pose such questions and, perhaps more importantly, why it would think the answers would be a major revelation to readers:
- “The percentage of student victims of violent crimes who reported being afraid of attack or harm at school (23.2 percent) was higher than that of nonvictims (4.9 percent) (figure 5 and table 7)”
- “A higher percentage of students reporting any crime avoided specific places at school because of fear of attack or harm than did nonvictims (13.1 percent vs. 5.4 percent) (figure 5 and table 7)”
Well, duh…no kidding? And you spent how much federal tax money to come to that conclusion?
Doesn’t common sense tell us that students who are victims of crimes at school would be more fearful of attack or harm than students who have not been victimized? Wouldn’t one expect this to be an obvious and normal reaction? And wouldn’t you expect them to avoid specific places out of fear of attack if they have previously been victimized there?
The U.S. Department of Education’s school crime statistics are, by and large, a joke, and have been for many years. They typically reflect outdated data (this report covers 2007 but was published in mid-2010) and are based on a limited number of academic research surveys rather than actual incident-based data or law enforcement data.
Why? The answer is simple: There is no federal mandatory school crime reporting and tracking for K-12 schools. The federal government’s response, while presumably well-intended, is based on a hodgepodge collection of academic research studies. The real story is often told in the fine print but unfortunately, the media and others who cite the report either rarely read the fine print or simply don’t care.
At the end of the second paragraph on page iii, the “Highlights” page, the report states:
- “Readers should be alerted to the limitations of the survey design and the analytical approach used here with regard to causality. Conclusions about causality between school or student characteristics and victimization cannot be made due to the cross-sectional, nonexperimental design of the SCS.”
So, in short, an educator or school safety professional can conclude nothing about causality because of the project’s design and how the data were collected. Why, then, bother doing the project and the report at all?
Dig deeper into pages 2-3 and page A-5, and you’ll find numerous data limitations and disclaimers. While the researchers appear to have made a valiant effort, the report contains so many limitations and disclaimers that it creates the perception that one should take this report, and the numbers therein, with a grain of salt. I think that perception is justified.
We already know school crimes are often underreported to state departments of education and to law enforcement.
Lessons learned: Base your school district’s policy and funding decisions on local school and public safety data. Include incident-based law enforcement data in addition to surveys. And if you’re going to do something, do it right; that goes for the media as well, who should also be reading the fine-print disclaimers.
Parents Beware! School crime and discipline data is sometimes not worth the paper (or web site) on which it is published. Read the fine print first; it may save you the time of reading the rest of the report.
_______________________
Republished with permission from Ken Trump, a national consultant, speaker, author and expert on K-12 school safety and security, emergency preparedness and crisis planning. He writes regularly at http://www.schoolsecurityblog.com