Throughout history, humans have repeatedly rejected facts and evidence in favour of belief systems that serve their emotional and/or political needs.
During the Black Death in the 14th century, conspiracy theories were rampant across Europe. The blaming of Jews, witches and heretics was common, when the real culprits were bacteria transmitted mainly by fleas and rodents. In the 1880s, there were fears of European elites overthrowing American democracy and accusations that Chinese immigrants spread disease. “Us versus them” explanations were fantasies that fed societal anxieties. Perversely, they also comforted the anxious.
During the COVID pandemic, distrust of medical authorities took over Facebook’s newsfeeds. Researchers found that a group of just 12 people, labelled the “Disinformation Dozen”, became responsible for 65 per cent of the anti-vaccine content that went viral across Facebook, TikTok and Twitter, now X. For too many people, it was simpler to label vaccines and lockdowns as conspiracies than to grapple with the reality of uncertainty, political missteps, the race to understand the virus and the necessarily flawed nature of evolving science. When their freedom of movement was at stake, a fact could be a dangerous thing. “Doing your own research” became a euphemism for going down a rabbit hole of piecemealing YouTube videos, blog posts and opinions on social media platforms.
Meta launched fact-checking in 2016 in response to criticism that Facebook helped to spread misinformation during the US presidential campaign – the one that led to Trump’s first presidency.
Most people I know who published a post that triggered the “false information” flag ignored it, or viewed it with hostility, as an ideological weapon. Pew Research Center research in 2019 found that almost half of Americans, and most Republicans, believed fact-checkers were biased. And now that is Zuckerberg’s claim: his paid fact-checkers are biased and can no longer be trusted. So he’ll do something akin to Elon Musk’s “community notes” system on X. That turns all users into experts. It hands over the job of challenging falsehoods and misinformation to unpaid social media users, a ragtag army of volunteer moderators. It also hands this responsibility for giving a second opinion to large language models – the artificial intelligence systems that underpin platforms such as ChatGPT.
Some of the human volunteers will be great, no doubt. Many will love a good fact, or the challenge of hunting one down. Others won’t know a fact if they trip over it. And volunteer moderators will bring to the task their own views and prejudices, while AI carries inherent biases drawn from the datasets and developers that power it. It struggles to validate source information. One bias will challenge another. This will compound the existing problem of confirmation bias. It will send more social media users to the comfort of their echo chambers, especially when it comes to political campaigns, medicine and science.