Tribes, Filter Bubbles, and the Backfire Effect

Why do otherwise rational people, confronted with facts that contradict their deepest presumptions and worldview, generally reject those facts rather than revise their worldview? Hint: the answer has something to do with storytelling.

The latest US political clown-car adventure into the looking-glass world of “Alternative Facts” may have thrown a spotlight on how stubbornly Trumpaphiles can ignore data, but here’s a disturbing fact: we all think we’re the rational ones, adjusting the story we tell ourselves to the facts and morally reasoning our way to sound conclusions — but behavioural scientists know better. And for those of us in the business of convincing people to change their minds, it’s surprising we don’t pay more attention to the science of this.

The environmental movement has spent decades spewing facts. Reports. Charts. Infographics. We generate Belgium-sized sets of facts every week, football fields of them a day. Science facts. Political facts. All of them pointing to the big fact: that we’re exhausting the life support systems of planet Earth. We’ve published sage step-by-step manuals telling industry and government how they can evade disaster. We’ve issued starkly clear individual behaviour-change mandates. And still we’re hurtling toward the mythic inferno in the proverbial hand-basket.

“It’s because people are distracted by trivial things and you need to make them pay attention!” says Mister Fox, a bipedal figment of my trickster imagination, with a slightly sarcastic smirk and an index finger making an exclamation point in the air. “You need to shout more. Put out another report! A longer one! With more graphs! If you can get them to listen, they’ll change their minds. And really, seriously, we both know it’s not the fault of the facts — they are perfectly clear. It’s the fault of those useless hipster ‘communications experts’ drinking 5 Euro lattes and whining on Snapchat.”

Now I know Mister Fox pretty well, and I know when he’s setting me up.

And I also know it’s deeper than that.

Storytelling Training
I’ll be teaching a day-long course in story as theory of change in London in a few weeks. Spaces are limited; you can register here.

I’ve been reading up on filter bubbles, tribal identity, and the backfire effect. The backfire effect is nicely documented in “Why facts backfire” and a series of three “You Are Not So Smart” podcasts. (That’s not an editorial comment; it’s the podcast’s name.)

The basics are these. Studies dating back to 2005 have shown that when an individual is confronted with facts that contradict a core belief — say, that climate change is a hoax or that a majority of Trump supporters are bigots — those core beliefs will tend to be strengthened rather than weakened. WTF? Yup. In study after study, scientists have demonstrated that when it comes to facts and opinions stepping into the boxing ring, facts are on the floor for the ten count at the first punch. We deploy a battalion of psychological magic tricks to keep our worlds consistent and protect our beliefs from challenge. Chief among those tricks is motivated skepticism: the active prodding and poking attack we launch on ideas that don’t match our narratives, as opposed to the happy, unexamined acceptance of facts that confirm our beliefs.

When a fact doesn’t match our expectations, we question the source. We suspect cherry-picked data. We ascribe motives. We chip away at that hard cold reality with an icepick until we either no longer believe it or, perversely, accept the fact but hold fast to the belief it contradicts. A 2014 study found that parents who were unwilling to vaccinate their children because they believed flu vaccines could cause flu could be persuaded that flu shots don’t cause flu, but their unwillingness to inoculate their children went UP as their brains catalogued and shored up all the other reasons they thought vaccines were bad.

Consistency is part of the reason we reject contradictions of our worldview: there are good, solid psychological reasons why we don’t want to change our minds several times a day about the nature of our reality, so we pack our sandcastles hard. We construct a story in our head that we like — one that gerrymanders our behaviour into rough consistency, gives us a basis for predicting how others will behave, and yields persistent confirmations of our guesses and hunches (even if those confirmations are subconsciously cherry-picked). We happily change our minds about details of that sandcastle — you can convince me that Tesla, not Edison, invented the lightbulb without much effort. But anything that would require us to rebuild the castle? Good luck with that. Our sandcastle is all about who we are. Rebuilding it would mean revisiting our sense of self.

Worse, it would mean revisiting our idea of where we belong. Because deep in our big brains, we’re still Neanderthals and Homo sapiens sitting around the campfire, eyeing each other with suspicion. We’re afraid that if we accept as false a fact our friends accept as true, we are no longer one of them. We lose our marshmallow stick at the Liberal/Conservative/Libertarian/Environmentalist/Radical campfire. And way down in our psyches, we know what happens to primates that don’t have a tribe and wander out of their territory. Easier to explain away that pernickety fact than to face the wilderness alone.

This is precisely where the filter bubble of only hearing from and speaking to people who agree with our views becomes a calcifying, fossilising danger. We become increasingly intolerant of non-conventional views. We forget how to talk to people outside our silo. We become cultish, less empathetic, and start to view everything — from food choice to Super Bowl preferences to entertainment — through the lens of unexamined, conformist politics. Jonathan Haidt, in The Righteous Mind, demonstrated that we use moral reasoning “not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.”

In some sense, if we want to be effective at changing minds, we have to learn to know our own mind well enough to mistrust it. To really be able to deconstruct the stories we tell ourselves about how the world works, and to know intimately how we defend that story. How do we change our own minds without changing our core story? There are lessons there in those weird corridors and mazes of reasoning, deep lessons, about how to convince others. But they all mean walking — and I mean REALLY walking — in someone else’s shoes. That’s an impossible task until we take off our own.


To be continued… Part II of this series will offer some thoughts on what we might do to challenge our own filter bubbles, how we can avoid the backfire effect with stories that go where facts cannot, and how we might separate political identity from triage issues of human survival.
