
Leading article

We have more to fear from social media than AI

4 November 2023

9:00 AM

For once, Nick Clegg had a point. At the start of this week’s Artificial Intelligence summit at Bletchley Park, our former deputy prime minister spoke about the need to get priorities right. ‘My slight note of caution,’ he said, is that we ‘don’t allow the need to focus on proximate challenges to be crowded out by speculative, sometimes futuristic predictions’. He’s quite right. The most immediate threat is not the prospect of dysfunctional AI but the power wielded by Mark Zuckerberg, Sir Nick’s boss at Meta, and the extent to which the companies it runs, Facebook and Instagram, control the news.

On its own, Facebook has a stunning concentration of power: it’s now the UK’s third biggest news source after BBC1 and ITV, according to Ofcom surveys. More people get their news from Facebook and Instagram than from any newspaper. Algorithms scan the headlines and decide which stories to promote or hide. In this way Zuckerberg is more powerful than Murdoch, Hearst, Rothermere or any media baron. Meta algorithms control the way billions of people see the world: via decisions that are oblique and not fully understood even by the company.

This has kicked off a revolution in the way people see the world, with results that we are only now starting to understand. On page 16, Professor Jonathan Haidt links social media use in teenagers to a rise in anxiety, depression and self-harm. The suicide of a 14-year-old girl from Harrow, Molly Russell, led to a push to bring the Online Safety Bill into law. Her parents blamed Instagram and Pinterest for harming her mental health, and ultimately the coroner found that online content, along with depression, had contributed to her death. The legislation, which became law last week, threatens social media giants with multimillion-pound fines if they show children material that is deemed to be harmful. But the real problem with social media, says Professor Haidt, is the way it absorbs children’s time, keeps them from real interactions and encourages constant self-judgment and negative comparisons. The new law will not change that.

In an attempt to curb any loss of revenue, social media giants have become governments’ censorship partners. This is why a former politician like Nick Clegg has been hired by Zuckerberg. If we can be persuaded that the real risk is ‘disinformation’, Facebook stands to gain even more power.


When The Spectator published a cover gently mocking Joe Biden, and paid to promote it on Facebook, it was rejected by the platform for violating ‘advertising policies’. We were never told why, or what those standards were. The Spectator has satirised every prime minister since Wellington and every American president since Andrew Jackson. Why was a little gentle satire being rejected? It was an ominous indication of where censorship might lead.

During the pandemic we commissioned Carl Heneghan, the Oxford professor who writes this week’s cover story, to assess the science behind wearing face masks. He found it less than compelling, but his article was – and still is – labelled ‘false information’ by Facebook. It refuses to explain why or what the reasoning is. Facebook dislikes explanation: it prefers power without responsibility. Our Australian Facebook page has just been suspended after publishing a critical history of Hamas.

This is daily life now for publishers in the digital world. James Cleverly, the Foreign Secretary, recently gave an interview for our Chinese Whispers podcast in which he cast doubt on Chinese employment figures. We published him saying so on the Chinese-owned TikTok, the world’s no. 1 source for short videos. Mysteriously (and very unusually) the clip garnered zero views for several hours. We were told that it had been ‘stuck in moderation’. A clip highlighting the plight of the Uighurs received the same treatment.

Does this mean TikTok employs censors? Does Facebook actively try to discourage dissent about lockdown theory? The more mundane truth is that algorithms are simply primed to look for any articles or arguments deemed ‘sensitive’. These are pre-emptively suppressed, just to be safe. This creates a stultifying culture. Publishers try to avoid the algorithms and so they effectively self-censor. In the end it is the truth that suffers.

The Online Safety Bill will make this infinitely worse. The bill could have obliged social media firms to explain their decisions to hide certain articles and reveal the criteria by which they deem some information false. It could have applied meaningful age restrictions to stop children suffering in the way Professor Haidt describes. It did none of these things, and instead what has been created is an environment in which social media giants have more power than ever.

Facebook’s daily wielding of power is what Nick Clegg would call a ‘proximate challenge’, and it is one that ministers have just unwittingly made worse with the Online Safety Bill. If a Labour government follows through on its threat to impose state regulation of the press, these problems could be exacerbated. These are dark times for freedom of speech.


