Britain’s AI Surveillance State:
A Grave Assault on Liberty and Free Speech
TOM ARMSTRONG
British police plan to use artificial intelligence to predict crimes before they happen. The head of the College of Policing, that body guilty of pushing non-crime hate incidents on us, says forces across the country are running about 100 AI projects. None has had any public agreement or input. They say it will cut paperwork. Perhaps, though it is rare for a bureaucracy to do so without replacing it with other make-work flannel. But the main aim is to spot ‘future offenders’. The government has set aside £4 million for an AI map of Britain by 2030 that flags likely crime spots (something old-fashioned bobbies on the beat knew through experience). Police say that could stop fights or knife crimes early. They also say they might target men who pose risks to women and girls; with AI, they say, they could find them first. That is, before they commit any crime.
Home Secretary Shabana Mahmood is backing all this. Of course she is. And she is also backing wider use of facial recognition cameras, again, with no public debate or acceptance. Sinisterly, she wants AI to enable the state to watch everyone, all the time. Speaking at the Tony Blair Institute – where else? – she said: “When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his Panopticon. That is that the eyes of the state can be on you at all times.” This vision, to any rational person, indicates a looming totalitarian dystopia. Bentham’s Panopticon was a prison where guards saw inmates without being seen. Inmates behaved because of the constant fear of being watched. Mahmood wants this for all Britain, for you and me, for all of us. All the time. She adds: “I think there’s big space here for being able to harness the power of AI and tech to get ahead of the criminals, frankly, which is what we’re trying to do.” But this targets citizens, not just criminals. It flips innocence until proven guilty and crushes privacy and free speech.
Predictive policing scans past crime data for patterns. It marks hot spots or risky people, or people who might become risky in the future. In practice, that could mean more patrols in flagged areas or checks on named individuals. Combined with facial recognition, it builds a control net: police know where crimes might occur and who might commit them, and cameras track them. But predictive AI errs, like predictive text. West Midlands Police banned Israeli fans from a match on the strength of Microsoft’s Copilot AI, which made up a violent event that never happened. Chief Constable Craig Guildford retired, suffering no consequences. If AI lies about fans, what about real threats?
Mahmood’s words chill. “The eyes of the state can be on you at all times” echoes Orwell’s Big Brother. But she says it plainly, enthusiastically, obviously seeing the Panopticon as a blueprint for how the State controls citizens. People self-censor under constant gaze. Free speech dies. Dissent hides. Recent events show that the State targets dissent before crime. Two-tier policing favours some groups. In Harehills, Leeds, riots erupted after social services took a Roma child. Police fled as rioters flipped cars and burned buses. No firm response, no arrests. In Birmingham, police met “community leaders” after a notorious far-Left activist spread a lie about a far-right event, and then let the community – you know who they were – “police themselves.” A Muslim mob attacked a pub; innocent white men were hurt. No arrests. White groups, on the other hand, face harsh treatment. Mythical ‘far-Right’ protesters after the Southport murders saw quick arrests, yet pro-Palestine marches with jihad chants go unchecked. Police have even said “jihad” has many meanings.
Not unnaturally, therefore, many think that AI’s predictive eyes will focus on dissenting voices, not real criminals. It’s hard to see it targeting black youths or Islamists preaching jihad, isn’t it? Data reflects past biases; AI amplifies them. Amnesty, naturally, says UK predictive policing is racist and picks on the poor. But most will see it reinforcing the Establishment’s obvious anti-white racism and fear of the white working class. Predictive policing comes down to the state saying you are guilty before you have done anything. It violates common sense and rights such as a fair trial, and it will undoubtedly lead to the profiling of those likely to dissent from the tyrannical woke agenda.
It clashes with Magna Carta and Common Law. Magna Carta says that no one is to face imprisonment except by lawful judgment of his peers or the law of the land. Predictive policing does not merely turn the presumption of innocence on its head; it sets it aside completely – you will be guilty if the police AI program says you are guilty – a hideous version of the ‘computer says no’ horror we are all too familiar with, where the computer has the last say and nobody can gainsay it. Computer says ‘guilty!’ Common law requires proof of guilt beyond reasonable doubt. Predictive policing presumes guilt from data guesses and computer modelling and punishes before acts occur. Will they jail people “just in case”? Or send folk to re-education camps?
Conservative MP David Davis compares Mahmood’s plans with Minority Report, Steven Spielberg’s dystopian thriller in which “precogs” working for the “Precrime” unit predict murders before they happen, allowing arrests for crimes not yet committed. He warns that this was dystopian sci-fi: “If an AI system deems you to be at risk of committing a crime, how do you go about proving the AI is wrong?” Obviously, you can’t. Free speech will suffer, and no doubt that is part of the plan. Constant surveillance makes people avoid protests, with dissent labelled a threat. The UK government already targets online speech, arresting people for posts that cause “anxiety” – over 12,000 arrests in 2023 for online messages. AI adds to this. Post a comment saying mass immigration and multiculturalism have caused significant harm to our society? The AI super-cop marks you down as a potential terrorist. Fanciful? Ask Tommy Robinson. Globalist groups are gagging for all this. Tony Blair advises; his institute hosts Mahmood. The WEF talks up AI for governance. UN Agenda 2030 wants data cities and government-controlled AI expanding into all areas of our lives.
In the US, PredPol hit minority areas hard. China uses AI for social credit and to punish dissent. Britain risks going down the same sinister, totalitarian path. We have to fight this, and we still have the tools to do so and to have predictive AI policing banned. We must demand a focus on real policing: catching thieves and burglars. This is becoming critical now, and it is time to start asking the politicians who want our votes what they will do to stop and reverse this slide into tyranny.
This article (Britain’s AI Surveillance State) was created and published by Free Speech Backlash and is republished here under “Fair Use” with attribution to the author, Tom Armstrong.