The Current

'Evolution didn't work on truth, it worked on survival': A psychologist explains why we cling to our beliefs

People will find a way to defend their beliefs even when faced with contradictory evidence, says psychologist James Alcock. He talks to Anna Maria Tremonti about why we believe what we believe, and how evolution played a role.


Psychologist James Alcock explores why we defend some beliefs so forcefully in his book Belief: What It Means to Believe and Why Our Convictions Are So Compelling.


The stubbornness with which we cling to our beliefs has roots in how we evolved as a species, according to a professor of psychology.

"People are willing to believe the most egregious nonsense and hold that belief with considerable conviction," said James Alcock, a professor at York University, and author of Belief: What It Means to Believe and Why Our Convictions Are So Compelling.

But, he explained, that may come down to a survival instinct.

In a special edition, The Current looks at the state of truth in our world today: from how our belief systems work; to who suffers most when the truth gets sidelined; to the technologies being deployed to undermine reality — and potentially protect it.

Alcock spoke with Anna Maria Tremonti about why we believe what we believe. Here is part of their conversation.

There's that old saying, "seeing is believing." And yet two people can look at the same thing and come away with different opinions, even if we have cameras rolling and they can look at it again and again. How is that possible?

This is an interesting concept psychologically, because we tend to think of perception like a video camera: it's out there looking at the world, and we're taking in reality. But perceptions are actually constructions.

Our brains generate perception. And that perception, most of the time, will be the same for two different people, but it depends a great deal on our experiences.

So for instance, if you were to look through a microscope at some kind of specimen: if you're a microbiologist, you will see something quite different from someone who doesn't have that training, because their brain can't organize it in the same way.

To back down in our society is generally seen as a weakness ... very often people are pushed to defend beliefs, just to avoid losing. - James Alcock

And so if you have people looking at some scene and they've had different experiences, or they've got different attitudes, then they're going to perceive things differently. And more importantly, the next day they're going to remember them even more differently, because we always bring our memories into line with our past experiences and our expectations.

So we've evolved this ability to use memory and perception to navigate the world and survive. But evolution didn't work on truth, it worked on survival. Take, for example, an animal that tastes a food and gets sick. Most animals will never eat anything with that taste again, even though that food may not have been the cause of the sickness. But it's better, from an evolutionary point of view, to avoid that food if there are lots of other foods around. And that's not truth, but it works.

It's survival.


That is so interesting, that you make that distinction between evolution and truth versus survival. Why do some people continue to believe things that are verifiably false?

We're really adept at defending beliefs that are important to us, even in the face of contrary evidence. Take one example that I find very concerning: there are a lot of people around the world who are refusing to give their children the MMR vaccine (measles, mumps, rubella) because they believe it causes autism. And it's very clear this belief is based on fraudulent science, and the paper that made this connection has been withdrawn.

Andrew Wakefield with his wife Carmel at the General Medical Council in London, U.K., in January 2010. Wakefield lost his licence after the GMC ruled that he had acted "dishonestly and irresponsibly" in carrying out his research into vaccines. (Peter Macdiarmid/Getty Images)

The doctor who did it lost his licence.

Exactly, but there are still all sorts of people who believe it. And if you challenge them with what I've just said, they'll say: 'Well, that's Big Pharma, they're trying to cover it up.'

They'll find some way to support the belief. And we're all concerned about children, right? If you have children, they're vulnerable and you want to protect them, and you make this decision not to have that vaccine. Probably some of your neighbours say: 'Well, [they] should be having it,' so you've got to defend it. And the more you defend it, the more capable you become of almost ignoring all evidence.

Is that because of fear? You fear what might happen if you believe something to be true and you're wrong. Does fear play a role in how people hang on to something that is verifiably false?

At times, yes. But I think more importantly, we all learn from early on that losing an argument is a mark against you in some way.

To back down in our society is generally seen as a weakness. Now I don't want to exaggerate that too much. If the argument isn't important, backing down isn't important. But if the argument is important, then we feel as though we have been bested by giving in. And so very often people are pushed to defend beliefs, just to avoid losing.

More importantly though, if the belief is important to begin with, there are other good reasons for not wanting to give it up, right? Take religious beliefs, for example. If you believe in God, if you're a devout Roman Catholic say, is it possible that someone in a discussion or an argument is going to lead you to the point where you say: 'Gosh, my whole life has been wrong'? It's unlikely. The belief is so important that even if the logic were overwhelming, you'll find some way to defend your belief.


Written by Padraig Moran. Produced by Howard Goldenthal and Peter Mitton. Q & A has been edited for clarity and length.