UK Lawmakers Push for Expanded Online Censorship Despite Uncertainty Over Online Censorship Law’s Scope

MPs push for censorship powers they don’t fully understand, treating ambiguity in the law as a green light for control.

CINDY HARPER

At a recent UK parliamentary hearing on social media and algorithms, lawmakers ramped up calls for increased censorship online, despite revealing that they themselves remain unclear on what the existing law, the Online Safety Act, actually covers.

The session, led by the Science, Innovation, and Technology Committee, displayed a growing appetite among MPs to suppress lawful speech based on subjective notions of harm while failing to reconcile fundamental disagreements about the scope of regulatory authority.

Rather than defending open discourse, members of Parliament repeatedly urged regulators to expand their crackdown on speech that has not been deemed illegal. The recurring justification was the nebulous threat of “misinformation,” a term invoked throughout the hearing with little consistency and no legal definition within the current framework.

Labour MP Emily Darlington was among the most vocal proponents of more aggressive action. Citing the Netflix show Adolescence, she suggested that fictional portrayals of misogynistic radicalization warranted real-world censorship.

She pushed Ofcom to treat such content as either illegal or misinformative, regardless of whether the law permits such classifications. When Ofcom’s Director of Online Safety Strategy Delivery, Mark Bunting, explained that the Act does not allow sweeping regulation of misinformation, Darlington pushed back, demanding specific censorship powers that go well beyond the legislation’s intent.

Even more revealing was the contradiction between government officials themselves. While Bunting maintained that Ofcom’s ability to act on misinformation is extremely limited, Baroness Jones of Whitchurch insisted otherwise, claiming it falls under existing regulatory codes.

The discrepancy not only raised concerns about transparency and legal certainty but also highlighted the dangers of granting censorship powers to agencies that can’t even agree on the rules they’re enforcing.

John Edwards, the UK Information Commissioner, shifted the discussion toward algorithmic data use, arguing that manipulation of user data, especially that of children, could constitute harm. While Edwards did not advocate direct censorship, his remarks reinforced the broader push for increased state oversight of online systems, further blurring the line between content moderation and outright control over public discourse.

Committee Chair Chi Onwurah repeatedly voiced dissatisfaction that misinformation is not explicitly addressed by the Online Safety Act, implying that its exclusion rendered the law ineffective.

However, as Bunting explained, the Act does introduce a narrowly defined “false communications” offense, which only applies when falsehoods are sent with the intent to cause significant harm—a standard that is both difficult to prove and intentionally limited to avoid criminalizing protected expression. Onwurah appeared unimpressed by these legal safeguards.

Labour MP Adam Thompson pressed Ofcom to go further, asking why platforms weren’t being forced to de-amplify what he described as “harmful content.” Once again, Bunting noted that Ofcom’s mandate does not include blanket powers to suppress misinformation, and any such expansion would require new legislation. This admission did little to curb the committee’s broader push for more centralized control over online content.

The hearing also ventured into the economics of censorship, with several MPs targeting digital advertising as a driver of “misinformation.” Despite Ofcom’s limited remit in this area, lawmakers pushed for the government to regulate the entire online advertising supply chain. Baroness Jones acknowledged the issue but offered only vague references to ongoing discussions, without proposing any concrete mechanisms or timelines.

Steve Race, another Labour MP, argued that the Southport riots might have been prevented with a fully implemented Online Safety Act, despite no clear evidence that the law would have stopped the spread of controversial, but not illegal, claims. Baroness Jones responded by asserting that the Act could have empowered Ofcom to demand takedowns of illegal content. Yet when pressed on whether the specific false claims about the attacker’s identity would qualify as illegal, she sidestepped the question.

Ofcom’s testimony ultimately confirmed what civil libertarians have long warned: the Act does not require platforms to act against legal content, no matter how upsetting or widely circulated it may be. This hasn’t stopped officials from trying to stretch its interpretation or imply that platforms should go further on their own terms, an approach that invites arbitrary enforcement and regulatory mission creep.

Talitha Rowland of the Department for Science, Innovation, and Technology attempted to reconcile the contradictions by pointing to tech companies’ internal policies, suggesting that platform terms of service might function as a substitute for statutory regulation. But voluntary compliance, directed by unelected regulators under mounting political pressure, is a far cry from a transparent legal framework.

The entire hearing revealed a troubling dynamic: politicians eager to police online speech, regulators unsure of their actual powers, and a legal environment where vague definitions of “harm” are increasingly used to justify censorship by default.

The confusion among lawmakers and regulators alike should raise red flags for anyone concerned with due process, democratic accountability, or the right to express dissenting views in an open society. 


This article (UK Lawmakers Push for Expanded Online Censorship Despite Uncertainty Over Online Censorship Law’s Scope) was created and published by Reclaim the Net and is republished here under “Fair Use” with attribution to the author Cindy Harper.

See Related Article Below

UK Lawmakers and Regulators Target End-to-End Encryption and Smaller Messaging Platforms in Final Committee Hearing on Social Media “Harms”

UK MPs target end-to-end encryption as Ofcom, ICO, and DSIT probe Telegram, Signal, and post-Southport riot app use



DIDI RANKOVIC

The UK Science, Innovation, and Technology Committee’s fourth and final meeting on social media misinformation and harmful algorithms saw renewed attacks on end-to-end encryption and on platforms like Telegram and Signal.

And once again, representatives of the authorities tried to pin the blame for the Southport riots on social media and apps that fall outside the scope of what is regulated as Big Tech.

More: UK’s iCloud Encryption Crackdown Explained: Your Questions Answered on Apple’s Decision and How it Affects You

During the session held on April 29, the Committee sought answers about “social media, misinformation and harmful algorithms” from the regulator Ofcom, the Information Commissioner (ICO), and the Department of Science, Innovation and Technology (DSIT).

More: London Mayor Sadiq Khan Cites Southport Stabbing to Justify Targeting Online & “Conspiracy Theories” and “Misinformation”

Labour MP Paul Waugh focused on end-to-end encrypted messengers, specifically the one provided by Facebook, oddly referring to the secure technology as “a challenge” that needs to be “combated” and suggesting it is essentially a tool for enabling child sexual abuse online.

Addressing Ofcom Director of Online Safety Strategy Delivery Mark Bunting, Waugh wanted to know what the regulator was doing “to combat that challenge.”

Bunting replied that encryption has been “identified as one of the areas of risk that companies have to take account of” and called it “a problem.”

And while stating that encryption provides “enormous benefits in terms of privacy and security to users,” and adding that for those reasons it is “very highly valued by users” (but not so much by the authorities?), Bunting went on to state that “it does mean that a lot of the tools that we want to see companies use, including the AI (harms) detection tools, aren’t operable in encrypted environments.”

“We think it’s a challenge for the industry. We have been clear that we’re expecting the industry to do more work on techniques that are being developed to detect harmful activity in encrypted environments,” the Ofcom official said, as well as that this was “one of the priority areas of work” for his technology team.

More: UK Gov Pushes Censorship, Blaming Online Content for Southport Killings

Labour MP Emily Darlington used the meeting to go after smaller platforms that also provide users with end-to-end encryption, and tried to forge a link between their use and the riots that engulfed the UK last summer after the Southport murders of schoolchildren.

Telegram was among those singled out, despite having over one billion users. It was nevertheless “lumped in” with Signal, 4chan, 8chan, and others, all with the goal of connecting the dots Darlington sees between “small apps” and “far-right extremist activity” – the implication being that this is where such activity flies under the regulatory radar, beyond what Ofcom can currently enforce.

English politicians will never let us forget that Orwell was their compatriot, so, addressing Mark Bunting, Darlington mentioned that Ofcom has something called “a small high harms platform task force.”

The response was that this was “a really important area” for Ofcom, while DSIT Director for Security and Online Harm Talitha Rowland said her team was “really concerned” about “small but risky sites (that are) a real danger to UK citizens” – while praising Ofcom’s “small but risky” task force.


This article (UK Lawmakers and Regulators Target End-to-End Encryption and Smaller Messaging Platforms in Final Committee Hearing on Social Media “Harms”) was created and published by Reclaim the Net and is republished here under “Fair Use” with attribution to the author Didi Rankovic.



