This email newsletter has been cross-posted from Antonym. You can subscribe here if you want to see how the experiment goes.
Welcome to the second of my newsletters. This week seems to be about pulling together three strands of reading and thinking: polarised politics, conspiracy theories and social media platforms. Let me know what you think. Last week someone told me they liked the funny TikTok video, and there’s nothing like that this week, which suggests I may not be very good at responding to feedback.
Changing your mind is hard, never mind changing anyone else’s. Because of various cognitive biases, it’s hard to understand how someone on the other side of an argument can even hold their point of view. Data suggest that we assume people whose opinions are the polar opposite of our own are uniformly less intelligent and less informed than we are.
That’s the territory of Ali Goldsworthy’s forthcoming book, Poles Apart. Ali was the Deputy Chair of the Liberal Party in the UK and is now CEO of the Depolarization Project.
I was lucky enough to hear a talk this week by Ali about why polarization is so powerful and how it can be driven by people with the best intentions.
Polarization sounds like a bad thing, right? Funnily enough, not always. Forming groups and consensus is beneficial in society and politics. But there needs to be flux, flow and change, or divisions can become toxic and unhelpful. The unintended consequences of creating a movement – even one set on change or challenging genuine wrongs – are real and sometimes very harmful.
The phenomenon of polarization is complicated, but it’s clear that the more a belief becomes part of someone’s identity, the harder it is for them to change their mind, even when faced with evidence.
Lots of good stuff in these links (and one to pre-order Ali’s book).
- The Depolarization Project
- Ali’s essay on a Radio 4 podcast
- Poles Apart – forthcoming book on polarisation from Ali Goldsworthy, Laura Osborne and Alex Clark.
- Change My Mind – The Depolarization project’s podcast.
The surprising fragility of conspiracy theories
What about when the opposing point of view to your own really is a dangerous fantasy? Conspiracy theory-driven movements like QAnon are literally armed and dangerous in the US and just plain scary in the UK.
According to an article from Ars Technica, a paper from UCLA last year used data from Reddit and other platforms to study the structure of the stories that make up conspiracy theories, and to understand how those theories form – and found that they are surprisingly fragile.
“The narrative frameworks around conspiracy theories typically build up and stabilize fairly quickly, compared to factual conspiracies, which often take years to emerge, according to Tangherlini. Pizzagate stabilized within one month of the Wikileaks dump and remained relatively consistent for the next three years.”
[…] They found that conspiracy theories tend to form around certain narrative threads that connect various characters, places, and things across discrete domains of interaction that are otherwise not aligned. It’s a fragile construct: cut one of those crucial threads, and the story loses cohesiveness, and hence its viral power. This is not true of a factual conspiracy, which typically can hold up even if certain elements of the story are removed.
Conspiracy theories emerge from conversations that connect different bits of information. Jokes, misunderstandings, exaggerations and lies mix, and sometimes a story appears that catches on.
A fascinating aspect of the paper’s findings is how long it takes for a conspiracy narrative to become stable, for people to be repeating more or less the same story and spreading it. For Pizzagate – look it up if you don’t know it – this process took just one month.
How quickly these things take root reminded me of some analysis in a New York Times The Daily podcast on how people organised the attack on Congress online. The paper’s cybersecurity correspondent Sheera Frenkel followed the journey from an angry Facebook group, started the day after the presidential election, through to the violence at the Capitol on January 6th.
It was extremely well-organized. The day after the election, a group immediately pops up on Facebook called Stop the Steal. It initially builds on this base of kind of Tea Party activists and QAnon supporters and otherwise long-term members of MAGA — the Make America Great Again term that Trump likes to use. And they come together and they start collecting what they see as evidence of voter fraud.
Frenkel says the Facebook group was up for two days, long enough for half a million people to find one another and start organising. When the social network shut them down, the group and their narrative were already strong enough to migrate to a collection of other websites and platforms. A movement – a collection of Trump supporters, QAnon followers, religious extremists, and white supremacists of various stripes – formed and built a story and a loose organisation that carried out an insurrection.
The insurrection will be monetised.
To a liberal mind, it seems that the solution should be discussion and debate. Facts will drive out untruths, and reason will win the day. What may be emerging is an understanding that conspiracy theories and misinformation are systems issues.
Rather than deploring conspiracy theory movements’ departures from reality, we would serve society better by spotting and disrupting them early on. Like fire-watching in a forest or airline safety, the work may be carried out by private actors, but it should be regulated rather than left solely to private interests.
We’ve waited too long for big tech to do the necessary – but relatively simple – job of moderation adequately. The more complex work of spotting and putting out conspiracy theory fires before they spread needs attention, and it should not be left to those who have profited hugely from demagoguery and division.
Talking of monetisation, it turns out some of the violent insurrectionists in Washington on January 6th were earning money even as politicians fled from them and people were injured and killed. One man alone made at least $2,000 from his live stream video on a platform called DLive.
Governments may be the ones that have to provide this misinformation emergency service.
At a personal scale, Bellingcat used the Twitter account of the woman who was shot dead by Capitol police to tell the story of her radicalisation.
In terms of her views, Ashli Babbitt probably didn’t stand out from the crowd massed at the U.S. Capitol. And that is precisely why the story of her political awakening—well told through her activity on Twitter—is so instructive in understanding what brought that crowd together.
Over the past five years, a potent MAGA online subculture appears to have transformed this former Obama voter, who turned to Trump over a dislike of Hillary Clinton, into a QAnon follower ready to storm the Capitol. In a Twitter exchange on November 15 2018, Babbitt said that she had voted for President Obama, calling him “our president” and saying that he had done “great things.”

Bellingcat | The Journey of Ashli Babbitt
This past week was a bit of a struggle for a social media user like myself and that description of Twitter resonated. The news is too terrifying and compelling to ignore, but I have deleted Twitter and Facebook from my phone a couple of times this week and tried to find healthier things to do with my poor beleaguered mind in the meantime.
May you be calm, safe and well this week. Thank you for reading.