Tribes, Filter Bubbles, and the Backfire Effect

Why do otherwise rational people, confronted with facts that contradict their deepest presumptions and world view, generally reject those facts rather than revise their world view? Hint: the answer has something to do with storytelling.

The latest US political clown-car adventure into the looking-glass world of “Alternative Facts” may have thrown a spotlight on how stubbornly Trumpaphiles can ignore data, but here’s a disturbing fact: we all think we’re the rational ones, adjusting the story we tell ourselves to the facts and morally reasoning our way to sound conclusions — but behavioural scientists know better.  And for those of us in the business of convincing people to change their minds, it’s surprising we don’t pay more attention to the science of this.

The environmental movement has spent decades spewing facts. Reports. Charts. Infographics. We generate Belgium-sized sets of facts every week, football fields of them a day. Science facts. Political facts. All of them pointing to the big fact: that we’re exhausting the life-support systems of planet Earth. We’ve published sage step-by-step manuals telling industry and government how they can avert disaster. We’ve issued starkly clear individual behaviour-change mandates. And still we’re hurtling toward the mythic inferno in the proverbial hand-basket.

“It’s because people are distracted by trivial things and you need to make them pay attention!” says Mister Fox, a bipedal figment of my trickster imagination, with a slightly sarcastic smirk and an index finger making an exclamation point in the air. “You need to shout more. Put out another report! A longer one! With more graphs! If you can get them to listen, they’ll change their minds. And really, seriously, we both know it’s not the fault of the facts — they are perfectly clear. It’s the fault of those useless hipster ‘communications experts’ drinking 5-Euro lattes and whining on Snapchat.”

Now I know Mister Fox pretty well, and I know when he’s setting me up.

And I also know it’s deeper than that.

Storytelling Training
I’ll be teaching a day-long course on story as theory of change in London in a few weeks. Spaces are limited; you can register here.

I’ve been reading up on filter bubbles, tribal identity, and the Backfire effect. The Backfire effect is nicely documented in “Why facts backfire” and a series of three “You Are Not So Smart” podcasts. (Not an editorial comment; that’s the podcast’s name.)

The basics are these. Studies dating back to 2005 have shown that when an individual is confronted with facts that contradict a core belief — say that climate change is a hoax or that a majority of Trump supporters are bigots — those core beliefs will tend to be strengthened rather than weakened. WTF? Yup. In study after study scientists have demonstrated that when it comes to facts and opinions stepping into the boxing ring, facts are on the floor for the ten count at the first punch. We deploy a battalion of psychological magic tricks to keep our worlds consistent and protect our beliefs from challenge. Chief among those tricks, motivated skepticism: the active prodding and poking attack we launch on ideas that don’t match our narratives, as opposed to the happy unexamined acceptance of facts that confirm our beliefs.

When a fact doesn’t match our expectations, we question the source. We suspect cherry-picked data. We ascribe motives. We chip away at that cold, hard reality with an icepick until we either no longer believe it or, perversely, accept the fact but hold fast to the belief it contradicts. In a 2014 study, parents who were unwilling to vaccinate their children because they believed flu vaccines could cause flu were persuaded that flu shots don’t cause flu, yet their unwillingness to inoculate their children went UP as their brains catalogued and shored up all the other reasons they thought vaccines were bad.

Consistency is part of the reason we reject contradictions of our worldview: there are good, solid psychological reasons why we don’t want to change our minds several times a day about the nature of our reality, so we pack our sandcastles hard. We construct a story in our head that we like — one that gerrymanders our behaviour into rough consistency, gives us a basis for predicting how others will behave, and yields persistent confirmations of our guesses and hunches (even if those confirmations are subconsciously cherry-picked). We happily change our minds about details of that sandcastle — you can convince me without much effort that Tesla, not Edison, invented the lightbulb. But anything that would require us to rebuild the castle? Good luck with that. Our sandcastle is all about who we are. Rebuilding it would mean revisiting our sense of self.

Worse, it would mean revisiting our idea of where we belong. Because deep in our big brains, we’re still Neanderthals and Homo sapiens sitting around the campfire, eyeing each other with suspicion. We’re afraid that if we accept as false a fact our friends accept as true, we are no longer one of them. We lose our marshmallow stick at the Liberal/Conservative/Libertarian/Environmentalist/Radical campfire. And way down in our psyches, we know what happens to primates that have no tribe and wander out of their territory. Easier to explain away that pernickety fact than to face the wilderness alone.

This is precisely where the filter bubble — only hearing from and speaking to people who agree with our views — becomes a calcifying, fossilising danger. We become increasingly intolerant of non-conventional views. We forget how to talk to people outside our silo. We become cultish and less empathetic, and start to view everything — from food choices to Super Bowl preferences to entertainment — through the lens of unexamined, conformist politics. Jonathan Haidt, in The Righteous Mind, demonstrated that we use moral reasoning “not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.”

In some sense, if we want to be effective at changing minds, we have to learn to know our own minds well enough to mistrust them — to really be able to deconstruct the stories we tell ourselves about how the world works and know intimately how we defend those stories. How do we change our own minds without changing our core story? There are deep lessons in those weird corridors and mazes of reasoning about how to convince others. But they all mean walking — and I mean REALLY walking — in someone else’s shoes. That’s an impossible task until we take off our own.

To be continued… Part II of this series will be some thoughts on what we might do to challenge our own filter bubble, how we avoid the backfire effect with stories that go where facts cannot, and how we might separate political identity from triage issues of human survival.
