How news organizations, including this one, unintentionally misinformed the public on guns.
The Dallas Morning News
June 28, 2017
Steve Doud, a subscriber from Plano, emailed me to say he’d read something in the June 21 Dallas Morning News that couldn’t possibly be true.
An eight-paragraph Washington Post article on page 10A reported on a national study about kids and guns.
The last sentence said 4.2 percent of American kids have witnessed a shooting in the past year.
“Really?” Doud wrote. “Does it really sound believable that one kid out of every 24 has witnessed a shooting in the last year? I think not, unless it was on TV, in a movie, or in a video game. In that case it would probably be more like 100 percent.”
His instincts were right. The statistic was not.
Here is the unfortunate story of how a couple of teams of researchers and a whole bunch of news organizations, including this one, unintentionally but thoroughly misinformed the public.
It all started in 2015 when University of New Hampshire sociology professor David Finkelhor and two colleagues published a study called “Prevalence of Childhood Exposure to Violence, Crime, and Abuse.”
They gathered data by conducting phone interviews with parents and kids around the country.
The Finkelhor study included a table showing the percentage of kids “witnessing or having indirect exposure” to different kinds of violence in the past year.
The figure under “exposure to shooting” was 4 percent.
Those words — exposure to shooting — are going to become a problem in just a minute.
Earlier this month, researchers from the CDC and the University of Texas published a nationwide study of gun violence in the journal Pediatrics.
They reported that, on average, 7,100 children under 18 were shot each year from 2012 to 2014, and that about 1,300 a year died.
No one has questioned those stats.
The CDC-UT researchers also quoted the “exposure to shooting” statistic from the Finkelhor study, changing the wording and, for some reason, the stat just slightly:
“Recent evidence from the National Survey of Children’s Exposure to Violence indicates that 4.2 percent of children aged 0 to 17 in the United States have witnessed a shooting in the past year.”
The Washington Post wrote a piece about the CDC-UT study.
Why not? Fascinating stuff! The story included the line about all those kids witnessing shootings.
The Dallas Morning News picked up a version of the Washington Post story.
According to Finkelhor, the actual question the researchers asked was, “At any time in (your child’s/your) life, (was your child/were you) in any place in real life where (he/she/you) could see or hear people being shot, bombs going off, or street riots?”
So the question was about much more than just shootings.
But you never would have known from looking at the table.
Finkelhor said he understood why “exposure to shooting” might have misled the CDC-UT researchers even though his team provided the underlying question in the appendices.
Linda Dahlberg, a CDC violence prevention researcher and co-author of the study featured in The Post and this newspaper, said her team didn’t notice anything indicating the statistic covered other things.
Then again, the Finkelhor study didn’t say anything about kids “witnessing” shootings; that wording was added by the CDC-UT team. Dahlberg said she’ll ask Pediatrics about running a correction.
All of this matters because scientific studies — and the way journalists report on them — can affect public opinion and ultimately public policy. The idea that one in 25 kids witnessed a shooting in the past year was reported around the world, and some of the world probably believed it.
No matter where you stand on guns or any other issue, we ought to be making decisions based on good information.
Finkelhor’s team caused confusion by mislabeling a complicated stat. The CDC-UT researchers should have found the information suspect. The Washington Post should have asked more questions about that line from the CDC-UT study.
And we should have been as skeptical of the Washington Post report as Steve Doud was.
Mike Wilson is the editor of The Dallas Morning News. Email: email@example.com