British Police Test AI System to Profile Individuals Using Sensitive Data From 80 Sources

A powerful new police tool quietly turns everyday data into full-spectrum portraits of people’s lives.

[Header image: close-up of a blue eye overlaid with digital data elements and the British flag.]

KEN MACON

British police forces have begun acquiring AI software from a US tech company that merges sensitive personal data, such as race, health, political views, religious beliefs, sexuality, and union membership, into a unified intelligence platform.

A leaked internal memo from Bedfordshire Police, obtained through a freedom of information request, reveals plans to roll out the “Nectar” system beyond its pilot stage.

Developed in partnership with Palantir Technologies, Nectar draws together approximately 80 data streams, from traffic cameras to intelligence files, into a single platform. Its stated aim is to generate in-depth profiles of suspects and to support investigations involving victims, witnesses, and vulnerable groups, including minors.

According to the 34-page briefing, police leadership hopes to extend the software’s deployment from Bedfordshire and the Eastern Region Serious Organised Crime Unit to a national scale, Liberty reported. The briefing asserts the system could enhance crime prevention efforts and protect at-risk individuals more effectively.

[Document excerpt: the official Data Protection Impact Assessment (DPIA) for the Palantir Foundry Platform (Nectar) at Bedfordshire Police. It describes the project’s goal of supporting multiple police units, and eventually national deployment, to protect vulnerable people by preventing, detecting, and investigating crime. The special category data listed includes race, ethnic origin, political opinions, philosophical beliefs, religion, trade union membership, genetic data, biometric data, health, sex life, and sexual orientation. Data subjects include persons suspected or convicted of criminal offences, victims, witnesses, children or vulnerable individuals, and employees.]

This move forms part of a broader governmental initiative to apply artificial intelligence across public services, including health and defense, often via private-sector partnerships such as this one.

However, the deployment of Nectar, which accesses eleven “special category” data types, has raised alarms among privacy advocates and some lawmakers. These categories include race, sexual orientation, political opinions, and trade union membership.

While Palantir and Bedfordshire Police emphasize that Nectar only uses information already held in existing law enforcement databases and remains inaccessible to non-police personnel, concerns are mounting over potential misuse, such as data being retained without proper deletion processes, and over the risk that innocent individuals could be flagged by algorithms designed to identify criminal networks.

[Document excerpt: a DPIA checklist of special category data selected for the proposal: race, ethnic origin, political opinions, sex life, religion, trade union membership, genetic data, biometric data, sexual orientation, and health. “Philosophical beliefs” and “None” are not selected.]

Former Shadow Home Secretary David Davis voiced alarm to the i paper, calling for parliamentary scrutiny and warning that “zero oversight” might lead to the police “appropriating the powers they want.”

Liberty and other campaigners have also questioned whether Nectar effectively constitutes a mass surveillance tool, capable of assembling detailed “360-degree” profiles on individuals.

In response, a Bedfordshire Police spokesperson stated the initiative is an “explorative exercise” focused on lawfully sourced, securely handled data.

They argue the system accelerates case processing and supports interventions in abuse or exploitation cases, especially those involving children. Palantir added that within the first eight days of deployment, Nectar helped identify over 120 young people potentially at risk and facilitated Clare’s Law notifications.

Palantir, which built Nectar using its Foundry data platform, insists its software does not introduce predictive policing or racial profiling and does not add data beyond what police already collect. The firm maintains that its role is confined to data organization, not decision-making.

Still, experts express deep unease.

Although national rollout has not yet been authorized, the Home Office confirms that results from the pilot will inform future decisions. With private-sector AI tools embedded more deeply into policing, questions about oversight, transparency, data deletion, and individual rights loom ever larger.


This article (British Police Test AI System to Profile Individuals Using Sensitive Data From 80 Sources) was created and published by Reclaim the Net and is republished here under “Fair Use” with attribution to the author Ken Macon.
