Earlier in this series, StopWatch readers were shown how suspicionless policing powers and privatised algorithmic tools are reshaping everyday law enforcement. This article takes that argument further, examining how the government’s facial recognition consultation, racial bias evidence, and rapid infrastructure expansion together reveal a troubling shift: biometric stop and search is being normalised first, and regulated later.
Not whether. How?
The government is sending mixed messages on facial recognition. On one hand, ministers have launched a public consultation promising 'safeguards' on police use of face-scanning technology.
On the other, they are simultaneously greenlighting the largest-ever expansion of live facial recognition (LFR) policing: a £115 million plan over the next three years to create a National Police AI Hub and deploy 40 new facial recognition vans across England and Wales, following the Home Office’s recently published white paper, taking the national fleet from the existing 10 vans up to 50. The Home Office heralds this as the 'most significant modernisation of policing in nearly 200 years'. The contradiction is glaring: while the public is being asked how (or if) facial recognition should be used, the authorities are barrelling ahead to embed it into everyday policing.
That is the real story here. This is not a neutral conversation about future possibilities. It is the managed public staging of a direction of travel that has already been chosen. The consultation itself, announced by ministerial written statement on 4 December 2025 and open until 12 February 2026, openly admits that police currently rely on a 'patchwork of common law, data protection, and human rights legislation.' It also proposes a new oversight body to set rules and check compliance. But now that the consultation window has closed, the public debate has effectively passed while the machinery of expansion remains in motion. The government is not asking whether biometric stop and search should be normalised. It is asking how to legalise what it is already scaling.
The January 2026 white paper simply extends that logic from pilot fleet to national model. What used to be framed as targeted and exceptional is now becoming routine equipment. Suspicion is shifting from the officer on the street to the systems around them: the van, the watchlist, the algorithm, the database.
The Home Office’s own racial bias problem
The Home Office has already measured bias in facial recognition. The National Physical Laboratory (NPL) tested the Cognitec FaceVACS-DBScan algorithm used for Police National Database facial searches. At a threshold where the overall false positive identification rate (FPIR) is 2.7%, the FPIR is 0.04% for White subjects, 4.0% for Asian subjects, 5.5% for Black subjects, and 9.9% for Black women. More false alerts mean more wrongful stops, more searches, and more coercive encounters.
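Those percentages are easier to grasp as counts. The short Python sketch below simply restates the NPL figures cited above as expected false matches per 10,000 searches at that threshold; it is an arithmetic illustration of the disparity, not a model of any actual deployment.

```python
# NPL-tested false positive identification rates (FPIR) for the
# Cognitec FaceVACS-DBScan algorithm at the threshold where the
# overall FPIR is 2.7%, as reported above.
fpir = {
    "White subjects": 0.0004,  # 0.04%
    "Asian subjects": 0.040,   # 4.0%
    "Black subjects": 0.055,   # 5.5%
    "Black women":    0.099,   # 9.9%
}

searches = 10_000
for group, rate in fpir.items():
    # Expected false matches if each group faced `searches` searches
    print(f"{group}: ~{searches * rate:.0f} false matches per {searches:,} searches")

# Relative disparity between the worst- and best-served groups
ratio = fpir["Black women"] / fpir["White subjects"]
print(f"Black women face roughly {ratio:.0f} times the false match rate of White subjects")
```

On these published figures, the same threshold that produces about 4 false matches per 10,000 searches for White subjects produces roughly 990 for Black women, a gap of well over two orders of magnitude.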
Those errors land in a policing landscape already defined by suspicionless power, where live facial recognition is already operating at scale. The Home Office says 13 forces have used or are using LFR, and deployments in London from January 2024 to September 2025 led to over 1,300 arrests. The Met’s annual report records 203 deployments and an estimated 3,147,436 faces passing through camera zones in a year. It also reports that 8 of its 10 false alerts involved people recorded as Black and that 6 false alerts led to police engagement.
The official response has been to manage the problem, not pause expansion. The Home Office says it has procured a new algorithm for police database searches, tested by NPL and usable at settings with no statistically significant bias, which it will test operationally in early 2026. Yet the same government is funding a five-fold increase in the national LFR van fleet without enforceable anti-bias standards or mandatory public reporting of demographic outcomes, leaving biometric stop and search to widen the same disproportionality it claims to fix.
The consultation theatre
By the time the consultation closed, the real decisions had already been made. The consultation promises clearer rules, new oversight, and more transparency. But it also explicitly says the new framework should 'support technological development' and make police use of these tools easier to justify and more future-proof. This is the new legal patchwork: not a pause for democratic consent, but a framework designed to stabilise and legitimise powers already being expanded.
Recent events make that even harder to ignore. On 24 February 2026, the Biometrics and Surveillance Camera Commissioner published his response to the consultation and warned that 'self-regulation and voluntary standards cannot be relied upon to safeguard human rights.' He argued that serious uses of facial recognition should face tighter legal restriction and, in some cases, independent pre-authorisation.
Two days later, it was reported that the Metropolitan police would begin a six-month pilot of operator-initiated facial recognition, with 100 officers using smartphone-based identity checks during stops. While the state talks about safeguards, facial recognition is moving closer to direct street-level encounters.
Other democracies have shown this is a rights choice, not a technical inevitability. The EU AI Act bans real-time remote biometric identification by law enforcement in publicly accessible spaces, with prohibitions effective from February 2025. The UK is doing the opposite, scaling first, then asking how to legalise the scale.
So, the final question is no longer just whether these systems are biased. The state’s own evidence already tells us that they are uneven, racialised, and dangerous in ways that map directly onto the disproportionality that StopWatch and other like-minded organisations have been documenting for years. The deeper question is simpler, and more political: who consented to this, and when?
If the consultation closes after the vans are funded, after the infrastructure is commissioned, and after the legal architecture is already being drafted around expansion, then this is not public consent. It is consultation theatre.
By Thomas Dixon, StopWatch volunteer
All blogposts are published with the permission of the author. The views expressed are solely the author’s own and do not necessarily represent the views of StopWatch UK.