
No sacred cows

Is Russell Brand really so dangerous?

8 October 2022

9:00 AM


Once the dust has settled over the government’s mini-Budget, another big political battle looms: the Online Safety Bill. This is the legislation that will make Ofcom responsible for regulating the internet so Britain becomes ‘the safest place in the world to go online’ – at least, that’s how the last government tried to sell it. It was due to go to the House of Lords for a second reading in July, but was put on hold because of the Tory leadership contest, and I was hoping it would never be resuscitated. Liz Truss and her lieutenants are currently going through the last administration’s legislative programme, seeing what they can ditch to free up some parliamentary time for new bills. But the Culture Secretary has said that, like some zombie, this bill will soon be back from the dead.

The problem its opponents face is that those in favour of it are more powerful and better at lobbying than we are. For instance, five ex-culture secretaries recently wrote an article for the Daily Telegraph headlined: ‘Watering down Online Safety Bill “will put children at risk”.’ This is a reference to the fact that the bill will empower Ofcom to impose large fines on social media platforms such as Twitter, Facebook and YouTube if they fail to remove content that’s harmful to children. A case in point is Molly Russell, the 14-year-old who took her own life in 2017. Last week, a coroner concluded that the material Molly had seen on social media that appeared to encourage suicide had ‘contributed to her death in a more than minimal way’.

This verdict was met with a chorus of voices clamouring for the bill to be passed as quickly as possible, including that of Prince William, who said that protecting young people from harm should be ‘a prerequisite, not an afterthought’ for social media companies. But this argument is a straw man. As far as I’m aware, none of the bill’s opponents object to holding the Silicon Valley giants to account for the death of children like Molly. Indeed, as a father of four teenagers, I want the big social media companies to remove material that endangers children, like the images of nooses, razor blades and sleeping pills that Molly was pointed towards by algorithms on Instagram and Pinterest in the weeks before her death.


My issue with the bill is that it doesn’t just aim to protect children from disturbing material, but grown-ups as well, including content that’s ‘legal but harmful’. We still don’t know exactly what legal material the proponents of the bill think adults need to be protected from – this Index Librorum Prohibitorum isn’t included in the bill itself, but will be set out in a separate statutory instrument, with future governments able to add to this inventory – an ominous hostage to fortune. However, in July the government did publish an ‘indicative list’ of content it would like social media companies to ‘address’, including ‘some health and vaccine misinformation’.

The problem with trying to incorporate nebulous concepts like ‘misinformation’ into law is that they will inevitably be abused by political activists and defenders of official orthodoxy to silence their opponents. For instance, last week YouTube removed a video made by the comedian Russell Brand on the grounds that it contained ‘harmful misinformation’ about the virus. His sin was to say that the National Institutes of Health had approved the drug Ivermectin as a treatment for Covid-19, when, in fact, it hasn’t. It was an innocent mistake – the NIH had approved the drug’s use in clinical trials, but not more widely – and Brand quickly corrected it. Nevertheless, the video was censored.

Was that fair? As Brand pointed out, YouTube hasn’t removed any videos of people claiming the Covid-19 vaccines are 100 per cent effective against infection – that’s not ‘harmful misinformation’, apparently, more of a noble lie. So what guarantee do we have that it won’t just be content that challenges the prevailing consensus that’s classed as ‘misinformation’ after the Online Safety Bill is passed and YouTube becomes an even more zealous enforcer of health and vaccine orthodoxy? We should put laws in place to prevent social media companies censoring legitimate discussion and debate in the name of protecting people from ‘misinformation’, not laws encouraging them to ramp it up.

So I’m going to be spending the next few weeks trying to persuade the government to turn this piece of legislation into the Children’s Online Safety Bill. By all means, let’s have a bill that makes the internet safer for teenagers. But adults should be able to judge for themselves just how dangerous it is to watch videos by Russell Brand.
