David Ropeik, an author and risk-perception consultant, writing in the uber-cool Undark, says that in 2011 the city leaders of Calgary, Alberta, bowed to public pressure and ended fluoridation of the local drinking water, despite clear evidence that the benefits of fluoridation vastly outweigh its risks. A recent study found that second graders in Calgary now have 3.8 more cavities, on average, than a similar group did back in 2004-05, when the water was still being treated.
In West Virginia, legislators in favor of shrinking government recently passed a law allowing sale of unpasteurized milk, despite convincing evidence that raw milk is a vector for pathogens like Salmonella, E. coli, and Listeria. To celebrate, the bill’s sponsor shared some raw milk with his colleagues, several of whom got sick. The legislator says it was just coincidence.
Since the 2011 nuclear accident in Fukushima, Japan, fear of radiation has prompted thyroid cancer screening for all children in the prefecture. The levels of radiation to which kids had been exposed were too low to pose significant danger, and the sensitive ultrasound screening technique is well known to find abnormal cells in most people’s thyroids, though in nearly all cases those cells will never cause cancer. As a result of this unprecedented scrutiny for an infinitesimal risk, hundreds of kids have had their thyroids removed unnecessarily, with far-reaching health implications for the rest of their lives.
A Canadian couple is mourning the death of their 19-month-old son from meningitis. They hadn’t vaccinated him, and treated him with natural remedies like horseradish root and olive leaf extract, refusing medical attention until the boy was unconscious and near death. They are facing criminal charges.
For anyone outside the emotions that produced these choices, it’s hard not to feel frustration at hearing about them. It’s hard not to call them ignorant, selfish, and irrational, or to label such behavior, as some do — often with more than a hint of derision — “science denialism.” It’s hard, but it’s necessary to resist, because treating such decision-making as merely flawed thinking that can be rectified with cold, hard reason flies in the face of compelling evidence to the contrary.
In fact, the evidence is clear that we sometimes can’t help making such mistakes. Our perceptions, of risk or anything else, are products of cognitive processes that operate outside our conscious control — running facts through the filters of our feelings and producing subjective judgments that disregard the evidence. The behavioral scientists Melissa Finucane and Paul Slovic call this the Affect Heuristic; it gives rise to what I call the risk perception gap: the dangers produced when we worry more than the evidence says we need to, or less than the evidence says we should. This is literally built into the wiring and chemistry of the brain. Our apparent irrationality is as innate as the functioning of our DNA or our cells.
Bill Leiss and I called it a risk communication vacuum in our 1997 book, Mad Cows and Mother’s Milk.
Whatever it’s called, people do irrational things, no matter how reasonably others argue.
Certainly I do.
And it drives people crazy.
Maybe it’s brain wiring – it’s certainly the fad in addiction, perception and mindfulness research – but it all sounds like a way to make a buck.
And that’s fine; everyone needs a salary.
Facts are never enough, empathy is often lacking, story-telling is key, but these are just observations, the blunt force of armchair critics. Creators create, and get involved on the front lines.
Put words into action.
Walk on.