27.01.2020

I see right through you, facial recognition

We saw it coming, but how far will facial recognition affect stop and search?

How apt, that 2020 is the year that we can see oh so clearly what the Metropolitan Police Service has been up to all along. Live Facial Recognition (LFR) is coming soon to a London borough near you, if you’re not careful.

This has been some time in the making; several constabularies have conducted facial recognition trials in recent years. But it seems the landmark ruling delivered in 2019 in R (Bridges) v Chief Constable of South Wales Police was the catalyst for the Met’s decision to press ahead with LFR in the capital.

In short, Lord Justice Haddon-Cave and Mr. Justice Swift of Cardiff High Court judged that although there is no explicit legislative lawful basis for facial recognition technology, its legitimate use instead rests upon ‘well-established common law principles’, namely that of the police’s duty to take necessary steps ‘in order to prevent and detect crime’.*

In this sense, LFR is no different from CCTV, ANPR, or body-worn video cameras. But questions were asked of those devices too, questions that remain pertinent to this day.

I need neither repeat them, nor that a parliamentary committee recommended a moratorium on the use of LFR until a legal framework was in place, nor the government’s promise to create that ‘strict legal framework’ for technologies such as biometrics, nor how unpopular it is with the British public, nor the disputes over its effectiveness, nor the fact that even if it were 100% foolproof, it would represent a 'sinister step' towards a surveillance state.

Let’s focus instead on how LFR may affect stop and search, a technique that – like facial recognition – police forces claim is intelligence-led. That intelligence can involve sourcing databases of individuals under suspicion. Yet the threshold for inclusion on the infamous Gangs Matrix database is known to be ‘very low’, requiring just two pieces of ‘verifiable intelligence’, with ‘no clear guidance or criteria’ for deciding what a gang even is. Not so smart.

For LFR, the Met's procedural and guidance documents refer to a Watchlist, inclusion on which 'needs to be justified based on the principles of necessity and proportionality’. However, this Watchlist 'is normally a subset of a much larger collection of images’, such as a police force’s custody image dataset, which may include the Gangs Matrix – a database you can find yourself on without having been involved in any criminal activity; Amnesty International’s research found that a third of individuals on the matrix have 'never even committed a serious offence’. Not necessary, not proportionate.

Still, the Cardiff High Court judges opined, what matters is that police ‘make no more than reasonable use’ of an image in seeking to accomplish ‘the purpose of the prevention and detection of crime, the investigation of alleged offences and the apprehension of suspects or persons unlawfully at large’.

To that end, police procedure insists that even in the event of an Alert (not quite a match – your face must return a similarity score above a predetermined threshold), an individual’s personal data will be deleted ‘within 31 days’, tops. One can only hope that the same institution caught unlawfully retaining an image database of millions of innocent persons harvested in secret can keep its promises this time.

Maybe they learnt some lessons from their trials, though, because those who wish to avoid LFR altogether are free to do so. Pages 11-12 of the procedural document warn officers that ‘the police have no legal powers to direct or compel members of the public to enter a Zone of Recognition’. Scant consolation for the Romford Dodger, but perhaps an acknowledgement that curtailing people's civil liberties in the blink of an eye for no good reason can backfire.

Overall, the Met's efforts to display transparency and standards over facial recognition sit awkwardly with the nature of its introduction to the public sphere. It is common law, but it was never legislated for. This means that although it can be challenged in court (the original case is due for an appeal hearing at the Court of Appeal in London in June),** the Met needs permission from no one – not even our lawmakers – to change the protocol governing its use in London.

I see parallels with the mission creep that occurred under best practice guidance given in relation to Section 60, minus even the token gesture of statutory validity underpinning it. Fewer than half of one percent of searches conducted under that power lead to arrest for possession of a dangerous weapon, and reporting of criminal incidents often notes that Section 60 is deployed after the event, too late to make a positive difference. In the context of stop and search, using LFR in the war on knife and gun crime risks compounding the Met's vainest attempts to deal with an issue that may be better solved by less intrusive means.

Ultimately, the increasing reliance on technology to tackle crime by the long arm of the law may come either to resemble Minority Report, or an Inspector Gadget complex. We'll see how this plays out.

By Eugene K

Communications volunteer for StopWatch UK

Photo from Pxfuel under a Creative Commons Zero (CC0) licence.


* It’s important to note that the Met’s LFR is still technically different from South Wales Police’s AFR (Automated or Assisted Facial Recognition), which is not used in real time.

** A shout out to Big Brother Watch and Liberty for their efforts in holding the actions of police forces using this technology to account.
