
12.12.2025

Suspicion by design: How AI and section 60 are reshaping everyday policing

Both algorithmic and section 60 policing have well-documented racial biases and abuses. Now, LFR adds another layer, argues Tom Dixon

Earlier this year, in Privatising the algorithm: When predictive policing moves beyond the state, StopWatch readers were introduced to the troubling, dystopian rise of suspicionless policing powers, notably section 60 stop and search, and their impact on our civil liberties. Since then, policing practice has begun to erode even the standard set by the Police and Criminal Evidence Act 1984 (PACE). Algorithmic tools now in use, from risk-scoring to live facial recognition (LFR) software, are being treated as a de facto ground for police action, applying opaque, veiled rules instead of individual suspicion. These might be called 'black-box predictions'.

AI policing toolkits

Police forces across England and Wales are rolling out a new toolkit of surveillance technologies to use on the public. The most prominent, and most worrying, is LFR. The Metropolitan police (MPS) and South Wales police (SWP) have already been using NEC's 'NeoFace' system, matching camera feeds against police watchlists. 2025 saw this spread, as the Home Office agreed a £20 million framework contract (with BlueLight Commercial) for nationwide LFR deployment, with suppliers including NEC. By mid-2025, there were 10 LFR vans funded by our government across seven forces, including London's Met and regional forces such as Greater Manchester, West Yorkshire, Bedfordshire, Surrey, Sussex, and Hampshire & Isle of Wight, conducting searches against over 150 million images drawn from passport and immigration databases. Greater Manchester police stressed that images of people who are not matched are deleted from its system; however, its recent reprimand from the Information Commissioner's Office (ICO) over lost CCTV footage raises the question of whether its data safeguards can be trusted.

The Met is also setting up 'Precise Policing Phase 2' (launched in September this year), a £25 million procurement to accelerate the adoption of AI policing products across the force.

Many of these systems come from private, cross-border companies contracted by our government (NEC being Japanese). US-based Palantir, a name slowly reaching our households, is also in the mix. Largely escaping mainstream media attention, its 'Nectar' platform links police databases across England and Wales, targeting people the system deems 'about to commit a criminal offence' and outsourcing police decision-making to privatised algorithms.

Legal risks and oversight gaps

As we approach 2026, this convergence of AI and section 60 continues to widen an accountability vacuum that traditional safeguards simply cannot fill. PACE requires forces to justify and record each search. The Independent Office for Police Conduct (IOPC) can investigate abuses, priding itself on setting 'the standards by which police should handle complaints'. Yet none of these mechanisms governs private algorithms: there is no statutory duty to audit these systems and no direction on how they should work.

Police chiefs have raised concerns, admitting that they have no control and 'no clear guidelines' on uses of facial recognition. This leaves a void in which neither parliament nor the police can hold vendors accountable or, crucially, require them to publish error rates or racial bias tests. A loophole has opened: StopWatch warns that police forces now 'point to AI "risk maps" rather than individual grounds for suspicion'. Even as the Home Office pushes national LFR implementation, key scrutiny and accountability posts, such as the Biometrics Commissioner, remain neglected and left in the dark.

Bodies such as the ICO warn that facial recognition use 'does not operate in a legal vacuum'. The Equality and Human Rights Commission likewise cautioned in August 2025 that the Met's LFR use is 'likely to be unlawful', urging that tighter restrictions and safeguards are a must. The question lingers: what does the future hold for public accountability and safety?

Racial disproportionality and indirect discrimination

As our previous article, Privatising the algorithm: When predictive policing moves beyond the state, showed, both algorithmic and section 60 policing have well-documented racial biases and abuses. Now LFR adds another layer. Recent practice shows that deployments continue to target minority neighbourhoods: the Met's October 2025 use of LFR in London boroughs like Croydon, and in neighbourhood crime hotspots, will inevitably cast a wider net over people of colour.

Predictive policing feeds a vicious circle, using predictive mapping to direct officers back to already overpoliced areas. Taken together, these tools support a new form of indirect discrimination. A facial recognition 'match' (a technology already shown to be unreliable) or a high risk score entrenches the cycle: Black and other marginalised communities will be more heavily scanned, stopped and searched, not because of new evidence of criminal wrongdoing but because historic geographical policing patterns label them as 'risky'. A self-fulfilling prophecy of 'successful' surveillance.

A trajectory toward pre-emptive policing?

Looking at what the future holds, there is no doubt that this trend will only accelerate. In mid-2025 the Home Office doubled its LFR fleet; new facial recognition cameras in town centres and at sporting events will arrive armed with the latest AI profiling; and the promised data link between the police and Palantir threatens to hurtle the UK into a model of privatised 'pre-emptive' policing. Civil society campaigners and rights groups, including StopWatch, Liberty and Big Brother Watch, have called for the law to catch up. In practice, that means repealing or reforming section 60, and introducing binding transparency, auditing and anti-bias standards.

Without this reform, our country risks cementing a two-tier justice system, where opaque, backdoor algorithmic code governs our lives by wielding suspicion by design. Where will it stop?


By Thomas Dixon, StopWatch volunteer

All blogposts are published with the permission of the author. The views expressed are solely the author’s own and do not necessarily represent the views of StopWatch UK.
