Britain’s online safety law running amok
British users of the X social-media platform got a shock last month when they discovered they were not able to read certain posts. Instead, they were confronted with the following message: “Due to local laws, we are temporarily restricting access to this content until X estimates your age.”
The local law in question is the Online Safety Act, a piece of legislation designed to make Britain the safest place in the world to be online. At least, that’s how it was sold when introduced in 2018 by Matt Hancock, the Conservative digital secretary at the time. After much debate in Parliament, the Act was eventually passed in 2023, but some of its clauses are only just being activated, including a legal duty imposed on social-media companies to protect children from harmful content.
That would not have been particularly controversial had it just meant online porn and sites promoting suicide and self-harm. That’s the type of content service providers are expected to protect children from in more than 20 American states, including Texas. But the word “harm” is not carefully defined in the Online Safety Act, and the penalty for failing to discharge this duty — as meted out by Ofcom, a powerful state regulator — is a fine of up to £18 million (about $24.4 million) or 10 per cent of annual global turnover, whichever is higher.
Not surprisingly, then, platforms such as X have been erring on the side of caution. The content X has age-restricted so far includes a Conservative Member of Parliament’s speech about the rape gang scandal involving predominantly men of Pakistani heritage preying on young teenage girls; video footage of a man being arrested at an anti-immigration protest in Leeds; and posts from a thread about Richard the Lionheart.
In some ways, this is the least problematic part of the Online Safety Act. Under the new regulatory regime, service providers are to be placed in different categories, with the highest — Category 1 — attracting the most onerous safety duties. One of those duties is to provide Ofcom with up-to-date assessments in which you assure the regulator that you have empowered users to block anonymously created content, should they so desire. That is a problem for Wikipedia, which promises its contributors anonymity. The Wikimedia Foundation, the non-profit that operates the site, mounted a legal challenge in May, questioning the criteria Ofcom is using to categorise different providers. That lawsuit was unsuccessful, which means Wikipedia may have no choice but to geo-block British users — or, at the very least, limit their access.
That is not an idle threat. Gab.com, a US-based right-wing social-media network, recently blocked people in Britain from accessing the site after receiving what it called “yet another demand from the UK’s speech police”, ie, Ofcom. British users now see the following message when clicking on Gab’s URL: “The latest e-mail from Ofcom ordered us to disclose information about our users and operations. We know where this leads: compelled censorship and British citizens thrown in jail for ‘hate speech’. We refuse to comply with this tyranny.”
Believe it or not, the Online Safety Act as originally drafted was even more censorial. It contained a clause empowering a senior government minister to draw up a list of content that was “legal but harmful” and then direct Ofcom to prioritise its removal. Various pro-free speech lobby groups, including the one I run, pushed back, pointing out the provision was at odds with a fundamental principle of English common law, which is that unless something is explicitly prohibited, it is permitted. This clause seemed to introduce a sinister grey area in which something was both lawful and prohibited. The then-Conservative government conceded the point and removed the offending section, but British prime minister Sir Keir Starmer, the Labour leader, is under pressure to bring it back.
Last month, a powerful parliamentary committee recommended amending the Act to grant Ofcom the power to insist on the removal of “legal but harmful content”, attributing social unrest that broke out in Britain last summer to “misinformation” algorithmically amplified on social media. The disorder followed the murder of three schoolgirls, which some social-media users wrongly blamed on an undocumented migrant.
Under the Labour government, which is more authoritarian than its Conservative predecessors, free speech campaigners have given up on trying to improve the Online Safety Act and are focusing on stopping it from getting worse. Ironically, our best hope may be the Trump Administration. Vice-President JD Vance has expressed misgivings about the erosion of free speech in Britain, lately his holiday destination. Let’s hope Starmer is listening, not least because any expansion of Ofcom’s powers enabling it to fine US tech companies for failing to remove lawful content might jeopardise a British-US trade agreement.
• Toby Young, a member of Britain’s House of Lords, is the founder and director of the Free Speech Union