
Questioning Covid and Climate Change is “Seditious” Says Britain’s New Chief Censor

LAURIE WASTELL
After the riots, the censors.
For all that we now know about Axel Rudakubana’s horrific crimes, it remains the decided view in officialdom that the countrywide disorder that followed them last summer was principally caused by social media – or rather, by people speaking their minds on social media. As I have written here before, such a view formed an essential part of the political narrative put out by Sir Keir Starmer and Yvette Cooper about the riots – repeated, for instance, by judges from the bench; in Pravda-like Department for Education guidance on how to talk to schoolchildren about what happened; and widely in the media. But it is now also – surprise, surprise – the official view of Ofcom, the broadcast regulator, and all its attendant online-censorship enthusiasts.
With Ofcom’s newly minted powers under the 2023 Online Safety Act (OSA), we can therefore expect the spectre of Southport to be the fuel behind a fierce new push for internet censorship. The Blairite quango has already been gearing up. By July, Ofcom had a total of 466 employees dedicated to ‘online safety’ – more than three times the 123 officials employed under the EU’s equivalent Digital Services Act – and it is set to add nearly a hundred more by next month. There are now just two months to go until Ofcom convenes its statutory Advisory Committee on Misinformation and Disinformation (ACMD) as part of the OSA, comprising “persons with expertise in the prevention and handling of disinformation and misinformation online” who will advise the UK Government on how best to scour the internet for wrongthink. Four decades after 1984, thanks to this Conservative Party legislation, Airstrip One is about to get its very own Minitrue.
The membership of that committee remains shrouded in mystery, but we do know who will be chairing it: Lord Richard Allan, a former Liberal Democrat MP, now a non-affiliated Life Peer, with a career in communications and technology, most notably at Facebook in the 2010s.
So, what kind of Big Brother will Lord Allan prove to be? It bodes ill that among Allan’s prior postings is an executive directorship at the European Digital Media Observatory (EDMO), an EU-funded fact-checker. This supposedly impartial organisation helped to draw up the EU’s broad and censorious Code of Practice on Disinformation, which now sits under the DSA. In its fact-checks, it files dissenting political speech, such as questioning Net Zero policies or the hashtag ‘Ireland is full’, under its rubric of “disinformation trends” said to cause “harm”.
We might consider some of his telling past comments in a May 2020 interview given to the Reuters Institute for the Study of Journalism. Perhaps because he’s a poacher-turned-gamekeeper, he doesn’t come across quite as ardent as your typical Marianna Spring (presenting social media as a “cesspit” of disinformation and “hate” and blithely cheering on censorship). He notes, for instance, that from a “fundamental democratic point of view”, it should not be for social-media platforms to contradict the choices of voters by deplatforming politicians for “harmful” speech. He also seems to understand that fact-checkers ought not to stray into the realm of opinion, since differences over an “editorial line” are not something they can plausibly resolve.
Nevertheless, he clearly shares the typical fashionable prejudices of others in his political milieu. Asked how platforms should treat “Right-wing people who claim COVID-19 is much less deadly than people say” (by nefariously publishing, erm, scientific studies which show this), he castigates this as an example of “selective use of information”, a sin of lockdown sceptics also committed by Net Zero sceptics. Speaking of people who “support the view that the virus is less deadly” or who “don’t like the idea of governments acting on climate change” – and who, heaven forbid, use “selective” scientific evidence to back those views up – he says: “They are expressing a point of view which I describe as seditious.” Blimey.
To be clear, in the interview Allan folds this view into a reason for not censoring such dissent, since clearly anyone espousing such dastardly views is not engaged in a “rational discussion where we’ll just present you with the facts and then we’ll fix the problem”. Well, that’s nice of him. Still, such susceptibility to lockdown cheerleading and Net Zero apologism hardly inspires confidence that, as chair, he will expend much social capital defending the rights of those who dissent from establishment shibboleths.
Worse, consider the types of people who will have the ear of Ofcom and Allan’s disinformation committee, something we can already see from the Science, Innovation and Technology Committee’s ongoing post-Southport inquiry into “Social media, misinformation and the role of algorithms”.
Even the inquiry’s framing, as the Free Speech Union noted in its own written submission, “[appears] to presuppose that concepts such as ‘misinformation’ and ‘algorithms’ are causing a degree of harm far exceeding what is empirically demonstrated, and imply an imperative for intensified regulatory or legislative action”. Indeed, the FSU adds, the weight given to online speech is approaching something of a “moral panic”.
Far from questioning the censorious framing of the inquiry, however, others in their submitted testimony have pushed for it to go further. Among these are Full Fact, which argues that OSA powers to go after merely “illegal” content are not enough, and the Centre for Countering Digital Hate (CCDH), which urges policymakers to implement its very own framework for a “safe and accountable” internet.
The CCDH in particular looks set to exercise an outsize influence on the direction of online regulation policy. It has close links with No 10, having been founded by Morgan McSweeney, now the PM’s Chief of Staff. And its CEO, Imran Ahmed, was one of the few witnesses invited to testify to the inquiry last month, along with BBC “disinformation reporter” Marianna Spring.
In service of his crusade for censorship, Mr Ahmed paints a positively lurid picture of the typical social media experience. The Southport unrest, he says, was “predictable and unsurprising” given the unfettered nature of social media, and was caused by “acute disinformation”. He warns that social media platforms encourage the “churning of fringe violative content into the mainstream” in their desire for engagement. Worse, those nefarious algorithms are not designed for “truth”, but rather to keep their dim-witted users hooked and reacting. “If you went on a social media platform and found the truth immediately, you would go off and do something else.” And there was me thinking X was a news aggregator with a pretty robust Community Notes feature from which one can log off at any time. (In fact, the responses of X, Meta and Google to the inquiry indicate that each did take significant steps to remove or deboost content during the Southport unrest – no level of content regulation, it seems, is ever enough for the CCDH.)
Strident as the CCDH is, it is also clearly highly influential. So much so that in the wake of the Southport disorder last summer it succeeded in gathering numerous ‘stakeholders’ together in order to push for beefing up the OSA.
The FSU notes:
In August, the Centre for Countering Digital Hate (CCDH) hosted a closed-door meeting under the Chatham House rule to discuss the role of social media in civil unrest. The meeting included officials from DSIT, the Home Office, Ofcom and other organisations. CCDH’s subsequent policy recommendations included amending the Online Safety Act to enable the Secretary of State for DSIT to grant Ofcom additional “emergency response” powers to fight “misinformation” that poses a “threat” to “national security” and “the health or safety of the public”.
More detail on this EU-style crisis-response mechanism:
CCDH’s proposal would involve amending the section 175 “special circumstances” directive created by the Online Safety Act to enable [Technology Minister] Peter Kyle to issue a “directive” to Ofcom to ramp up its censorship powers if the Government feels there is a threat to national security or to the health and safety of the public (both, notably, constituting exemptions under Article 10(2) of the ECHR which empower states in certain circumstances to curtail the liberties of their citizens).
While this has yet to be implemented, all the stars are aligning such that it soon could be. Should something like the Southport riots happen next summer, for instance, we will have a statutory ACMD tasked with overseeing heavy policing of online speech to quell unrest. When this inevitably doesn’t work – because, as it turns out, people are angry at the Government for lots of longstanding reasons, not just because of things they read on the internet – there will be calls for increasing online regulation still further. With Labour embarrassed and looking for someone to blame, and the likes of the CCDH already loud in its ears, it’s hardly a reach to imagine Lord Allan’s “independent” ACMD calling for the CCDH’s proposed crisis powers itself. Internet censorship in Britain looks set to get a whole lot worse.
This article (Questioning Covid and Climate Change is “Seditious” Says Britain’s New Chief Censor) was created and published by Daily Sceptic and is republished here under “Fair Use” with attribution to the author, Laurie Wastell.