We like it when our beliefs can't be disproven

The joys of unfalsifiability.

"Falsifiability" is one of those fancy buzzwords you hear a lot in research settings. Basically, it refers to whether or not a belief can be proven wrong. If I tell you that I weigh 70 pounds, this is a claim that can be easily tested and promptly thrown out by bringing me to a scale — that is, it's falsifiable. If, on the other hand, I tell you that everything in the universe is controlled by an invisible astral monkey with a million arms, then there's little you can do to prove, empirically, that this is a zany notion. Religious beliefs are often unfalsifiable, but they don't have a monopoly on this characteristic. Psychoanalysis, for example, is often held up as containing a lot of unfalsifiable beliefs. Sure, maybe sometimes "a cigar is just a cigar," but how are you going to prove that?
Referring to a belief as unfalsifiable is a classic way of undercutting it — "How am I going to argue with you when you believe a thing that no amount of evidence can prove wrong?" But an interesting new paper by Justin Friesen of the University of Waterloo, and Troy Campbell and Aaron Kay of Duke University, suggests that unfalsifiability isn't a bug but a feature. It seems we derive a psychological benefit from believing in things that can't be proven wrong, and that when we're presented with evidence contradicting our opinion on something, we turn to unfalsifiable justifications for comfort.

The paper, published in the Journal of Personality and Social Psychology, consists of four experiments in which Friesen and his colleagues ask people about their beliefs, gauge their reaction to rebuttals, and test the role of unfalsifiable beliefs. In one of them, for example, respondents were first asked how religious they consider themselves. Then they were given one of two passages to read about the discovery of the Higgs boson (a particle whose existence was long predicted by theoretical physics) — one claiming the finding was consistent with religious beliefs, the other claiming it undercut them.

Finally, the participants were asked to rate the importance of ten reasons to believe in God. Some, like "Living a moral life would be impossible without God," were relatively unfalsifiable, while others, like "Scientific evidence demonstrates that God exists," were relatively falsifiable. Religious respondents who were told the Higgs boson presented a threat to religion rated the unfalsifiable beliefs as more important — that is, presented with scientific evidence (albeit of a manufactured-by-researchers variety) that questioned their beliefs, they turned toward statements that can't be proven wrong.

The researchers found a similar effect in the more secular realm of politics. After being asked how much they supported the president, participants read a passage listing five ways to gauge Obama's performance (foreign policy, job creation, and so on). Some were randomly assigned a version of the passage that explicitly stated that some of these areas can be tested by looking at the numbers, while others (like foreign policy) cannot. Then they were asked to rate Obama's performance on the five issues. Among opponents of the president, those who encountered the falsifiability passage rated Obama's performance higher — that is, when they were reminded that some of this stuff can be tested, they appeared to rein in their views a little. (There was no statistically significant effect among supporters of the president.)

Okay, so what does all of this mean? Any study consisting of survey questions can only get us so far in understanding people's real-world political and religious behavior and beliefs, of course. But this paper serves as an important reminder that our beliefs don't arise out of an objective assessment of the facts, and we shouldn't expect evidence to be able to shake those beliefs we hold most closely. Rather, as the authors point out, "when a belief system is serving important psychological needs such as providing meaning or self-worth, it may become risky to subject that belief to rigorous testing."

There's a broken-record aspect to this observation, but we really aren't as rational as we think we are.

This article originally appeared on Science of Us: We Like It When Our Beliefs Can't Be Disproven. © 2014 All Rights Reserved. Distributed by Tribune Content Agency.

By Jesse Singal
