We all like to think of ourselves as rational beings. Trouble is, this immediately sets up a paradox. If we are rational, we should be swayed by convincing evidence. And convincing evidence shows that human beings are very often not rational at all – we play the lottery, we are superstitious, and we ‘go with our gut’.
That’s why the beautiful concept of ‘rational thought’ will nearly always be a goal we can only strive to achieve. I have to remind myself of this every time I think ‘why would anyone sensible believe in homeopathy/conspiracy theories/creationism/hell/delete as appropriate’. It’s easy to tell yourself that these people must be stupid and irrational. But that’s not what the evidence really shows.
A friend pointed me to a great article entitled ‘How convenient! The Epistemic Rationale of Self-validating Belief Systems’, by Maarten Boudry and Johan Braeckman. The title could be translated as ‘why people believe crazy stuff’. And the short answer seems to be: because that’s the way our brains work.
A good example is superstitious beliefs, which I once heard beautifully illustrated by Professor Bruce Hood. If someone offered you an exact replica of your wedding ring, would you take it? Of course not. Because it wouldn’t be your ‘real’ ring. But actually, this feeling is totally irrational. Your ring isn’t imbued with the memories of your relationship. That’s your brain. The ring is just a piece of metal. But it really doesn’t feel that way, and you’d be hard pushed to rationalise away those superstitious scruples.
We can’t really help these superstitions. If your lucky pair of pants has been linked with past success, you are almost irresistibly drawn to the conclusion that your pants might – just might – have magical powers. You tell yourself you don’t really believe this. But you wear them for that job interview anyway. Just in case. Our brains are truly amazing at spotting patterns, even where there’s no logical, rational reason for them.
Another factor is our ability to conveniently sidestep evidence that conflicts with our beliefs. I think horoscopes are a great example of this. We all remember reading a few uncannily accurate predictions, but no-one really remembers the endless parade of vague and unhelpful ones. As the article explains, this pattern of behaviour may be more than simply ignoring inconvenient evidence.
For example, one study showed that both supporters and opponents of capital punishment were more likely to find flaws in research that conflicted with their point of view than in research that supported it. I think we all know this feeling – if I saw a headline tomorrow saying ‘climate change disproved’, I’d immediately think, ‘what a load of rubbish!’
This behaviour shows that most people aren’t comfortable with ignoring evidence against their beliefs. Instead, they are keen to rationalise it away, so they can reassure themselves that they are rational thinkers. The article gives creationists as an example – if they’re so irrational, why do they work so hard to explain away scientific evidence?
This can be explained by what’s called ‘cognitive dissonance’ – the mental discomfort we feel when presented with evidence that conflicts with a strongly held view. The more you have at stake, like your reputation or your view of yourself, the more likely you are to cling to your belief in the face of evidence against it. This makes me think of someone in a black and white movie saying, ‘I can’t believe it! I won’t believe it!’
The article gives a rather frightening example:
‘A classic illustration of cognitive dissonance can be found in the landmark study by Leon Festinger and his colleagues, who infiltrated a doomsday cult and observed the behavior of the followers when the prophesized end of the world failed to come true. The followers who had resigned from their jobs, given away their material belongings and were present at the arranged place and time with full conviction in their imminent salvation, became even more ardent believers after the prophecy failed, and started to proselytize even more actively for the cult. However, those for whom the cognitive stakes were lower (e.g. those who kept their belongings and stayed home in fearful expectation of what was supposedly to come), were more likely to abandon their beliefs afterwards.’
Interestingly, following this idea to its logical conclusion suggests that intelligent people would actually be better able to rationalise away inconvenient evidence, and so probably more likely to persist in weird beliefs. A sobering thought for anyone who’s told themselves ‘I’m not stupid enough to believe something like that’.
The article also gives a fascinating list of defence mechanisms that help ‘crazy’ beliefs to persist – for example, vague predictions that are almost certain to come true – ‘I sense you’ve lost something…’ – or the ability to justify almost any evidence after the fact – ‘I’m cured! The homeopathy worked!’ or ‘I’m still ill – I must not have taken enough’. These defences aren’t necessarily consciously created by believers – in fact it may be more likely that belief systems can only survive if they happen to have some resilience in the face of conflicting evidence. Survival of the fittest again!
As the authors of the article put it, ‘this invulnerability of belief systems may in part explain their unabated popularity. All other things being equal, belief systems that allow the believer to remain outside the reach of refutations, or that provide some convenient ways of coping with difficulties, will be more likely to be selected among competing beliefs and belief systems, and more likely to be disseminated.’
They conclude that ‘our susceptibility to self-validating belief systems is a function of several aspects of the way our human “belief engine” works: its inclination towards confirmation bias, its proficiency at rationalization and ad hoc reasoning, its valuation of an appearance of objectivity, and its motivation for cognitive dissonance reduction.’
In other words, none of us are immune to crazy beliefs. And once we’ve latched onto one, we are unlikely to let go. To finish with another paradox: rationality may not protect you from irrational beliefs.