Tuesday, April 30, 2013
This month had more than its fair share of bad news. In the days after the Boston bombing came reports of packages allegedly tainted with poison that were intercepted before reaching their intended target, President Barack Obama. Then there was a deadly explosion at a fertilizer plant near Waco, Texas -- a tragedy but, by all accounts so far, an accident. It was the kind of news week that left everyone looking for answers -- but answers were hard to come by. Even serious news outlets like CNN got swept up in hearsay and gossip -- indeed, CNN reported that a bombing suspect was in custody, when that was simply not true. Adding fuel to the news cycle fire were conspiracy theorists who insisted that the Boston bombing and the Texas explosion were inside jobs, even in the face of contradictory evidence.
Science journalist Michael Shermer believes that humans are hardwired to find meaning in chaos, even when there is truly no meaning to be found. He's the author of The Believing Brain and the editor of Skeptic magazine, and he spoke with Day 6 host Brent Bambury about why our biased brains are so attracted to conspiracy theories.
Shermer outlined several types of biases that lead our brain to jump to wrong conclusions. Patternicity, for example, is the tendency to make patterns out of random noise. "The brain is just not designed to see randomness. We look for -- and find -- meaningful patterns [and] any kind of connecting the dots will do," he said. "Even if it's totally random, and even if they're not really connected, it's not really possible to see non-connectedness."
So amid the chaos of an unfolding story like the Boston bombings, Shermer said, this sort of "patternicity" fits perfectly.
But why do our brains do this? In his book, Shermer argues that the propensity to find patterns evolved as a survival mechanism. "It's called 'association learning.' You associate A with B, and often A really is connected to B," he said. "So if there's a correlation, we tend to find a causation with it. More often than not in the natural environment of our evolutionary ancestors, those correlations were really connected. Even though you're not always right, the false positives you get usually are not deadly."
In other words, if you think someone's following you, it's probably a good idea to run. "Because there's no harm in assuming something bad, there's no selection against making 'type 1' errors, or false positives," said Shermer. "Thinking the rustle in the grass is a predator and it turns out it's just the wind -- that won't take you out of the gene pool. All it does is make you more cautious and suspicious, just in case it's real."
On the other hand, 'type 2' errors (that is, underestimating a potential danger) can take you out of the gene pool.
"In other words, the brain evolved the default option to just assume everything you see is real," said Shermer.
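The cost asymmetry Shermer describes can be sketched as a toy expected-cost calculation. All of the probabilities and costs below are invented purely for illustration -- the point is only that when a false negative is far more costly than a false positive, the "always assume it's real" policy comes out ahead:

```python
# Toy model of Shermer's error-management argument.
# A 'type 1' error (fleeing from wind) wastes a little energy;
# a 'type 2' error (ignoring a real predator) can be fatal.
# All numbers here are hypothetical, chosen only to show the asymmetry.

P_PREDATOR = 0.05          # chance the rustle in the grass is a real predator
COST_FALSE_POSITIVE = 1    # energy wasted fleeing from the wind
COST_FALSE_NEGATIVE = 100  # expected cost of ignoring a real predator

def expected_cost(p_flee):
    """Expected cost of a policy that flees with probability p_flee."""
    type2 = (1 - p_flee) * P_PREDATOR * COST_FALSE_NEGATIVE  # missed predators
    type1 = p_flee * (1 - P_PREDATOR) * COST_FALSE_POSITIVE  # needless fleeing
    return type2 + type1

paranoid = expected_cost(1.0)  # always assume the rustle is real
skeptic = expected_cost(0.0)   # never assume danger
print(f"always flee: {paranoid:.2f}, never flee: {skeptic:.2f}")
# Under these invented numbers, always fleeing costs 0.95 vs 5.00 for never fleeing.
```

With any plausible numbers where the false negative is much costlier than the false positive, selection favors the jumpy, pattern-assuming policy -- which is exactly the "default option" Shermer describes.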