Last week I asked why the Home Office was ignoring spit hoods, allowing individual police forces to roll them out on a piecemeal basis. (TL;DR it’s inexcusable, and people are being injured, or worse, as a result.) Another issue the government seems unwilling, or unable, to deal with is Facial Recognition Technology. Unlike spit hoods, it is not potentially deadly. But it matters. Here’s why.
Facial Recognition Technology used at Notting Hill Carnival
This year’s Notting Hill Carnival generated controversy as the Metropolitan Police Service trialled “mobile facial recognition software”. It was the second such trial at the Bank Holiday weekend event.
The police use Facial Recognition Technology to scan the faces of passers-by in public. The software can also use images taken in police station custody suites after arrest.
This has been going on for years. As a result, Paul Wiles, the government’s Biometrics Commissioner, says the police hold more than 20 million facial images across various databases. That is almost a third of the UK’s population, and includes “hundreds of thousands” of innocent people. Chances are, police databases include biometric image data for you and/or a member of your family.
Why does biometric data matter?
Normally, police get biometric data from suspects during the “booking in” process at a police station custody suite. This includes a DNA sample, fingerprints, and head-and-shoulders digital photographs. This biometric data is stored on the Police National Database (PND) and other databases for future investigations. Also, and significantly for people who have been unlawfully arrested, it can be part of a police record check.
A record of arrest and biometric data can be devastating to employment prospects, as my client Nigel Lang found out. He lost his job working with vulnerable teenagers after his wrongful arrest, compounding a deeply distressing event. With my help Nigel recovered compensation and, importantly for him, cleared the police’s records of his arrest and biometric data.
The police treat DNA and fingerprint data differently to custody photographs. Under the Protection of Freedoms Act 2012, DNA and fingerprints are automatically deleted if you are arrested and found to be innocent or released without charge.
Custody photographs are not. Local police forces keep these images and can add them to the Police National Database for use by all police forces in the UK. The police can also manipulate the images by adding biometric data to them. This data, which is akin to a digital fingerprint, is uploaded to police databases, where it can be cross-referenced with social media images, CCTV, live video and so on. Unless the police agree to delete them, database images are kept for at least 6 years. In practice, though, the police keep images indefinitely, because the rules provide for retention until the subject is 100 years old.
Lord Justice Richards found the Metropolitan Police’s policy of keeping facial images to be unlawful. In RMC & Anor, R (on the application of) v Commissioner of Police of the Metropolis & Ors he said:
I am not satisfied that the existing policy strikes a fair balance between the competing public and private interests and meets the requirements of proportionality. In my judgment, therefore, the retention of the claimants’ photographs in application of the existing policy amounts to an unjustified interference with their right to respect for their private life and is in breach of art.8.
It should be clear in the circumstances that a ‘reasonable further period’ for revising the policy is to be measured in months, not years.
The government disagreed.
It took 5 years for the Home Office to come up with a policy paper, Custody Images: Review of their Use and Retention. The Biometrics Commissioner heavily criticised it. Among other issues, he noted a fundamental fallacy which undermines the government’s position:
The review suggests that the retention and use of facial images is ‘generally less intrusive (than DNA or fingerprints) as many people’s faces are on public display all the time’. I disagree with that assertion. In fact for that reason the use of facial images is more intrusive because image capture can be done using cameras in public places and searched against government databases without the subject being aware. Facial images are no longer only used solely for custody purposes and image capture and facial searching capabilities have and are being used by the police in public places.
Further Legal Issues
As well as the court finding against the police and the Biometrics Commissioner’s criticism, forces must deal with other overlapping laws, including the:
- right to respect for private life under Article 8 of the Human Rights Act 1998 (as mentioned by LJ Richards in his judgment),
- requirement to avoid discrimination under the Equality Act 2010, and
- Data Protection Act principles. These include rules that personal data shall be processed “fairly and lawfully” and “shall not be kept for longer than is necessary”.
This ought to have been enough for the police to pause their facial recognition programs and reflect. And yet they continue to harvest facial images and add biometric data to them.
It is hard to see why the police are pressing on with facial recognition technology when it is likely to lead to further legal challenges and costly litigation.
Add to this the fact that the Biometrics Commissioner has criticised both the police and the Home Office for failing to carry out proper testing or to put procedures and policies in place. He is especially concerned that Parliament has not been involved in the process to “reassure the public that their privacy is being properly protected”.
It can’t be right that:
- retention of fingerprints and DNA is regulated by statute, but
- facial images and related biometric data are not.
Cressida Dick, Commissioner for the Metropolitan Police, refused to respond to a letter from civil liberties and race relations groups asking her to pause this “shady enterprise” at the Notting Hill Carnival. She ignored them, which makes me wonder if she is truly committed to Peel’s 9 Principles of Policing, as I asked here.
In particular, I don’t know how police can use facial recognition technology without publicising it, or seeking Parliamentary approval, and still meet Principle 2:
To recognise always that the power of the police to fulfil their functions and duties is dependent on public approval of their existence, actions and behaviour and on their ability to secure and maintain public respect.
“Big Brother” Expansion
Liberty, the human rights organisation, found that the real-time facial recognition at the Carnival was a dismal failure, producing only 1 positive match over 4 days. It frequently produced false positives, including confusing men with women, and did not account for racial bias.
Despite this and the lack of public and parliamentary scrutiny, the Home Office plans to invest a further £5 million in the technology. Worryingly, this report says “Such technology will, initially, be used in law enforcement. In time, the scope of the deployment may extend to other public sector organisations, the Home Office said.”
Another Home Office Failure
There are clear parallels here with the spit hood situation. The Home Office, through its Centre for Applied Science and Technology (CAST), should have considered spit hoods years ago. It still has not. Letting individual forces decide if, and how, to use these potentially deadly tools is a shameful failure.
The Home Office seems intent on repeating the spit hood mistake. To date, 3 police forces have introduced facial recognition technology without CAST oversight. Inviting tenders from technology companies and spending millions of pounds of taxpayers’ money while avoiding parliamentary scrutiny, despite the demands of MPs, suggests a wilful disregard of government duties and the democratic process.
Read more from Kevin Donoghue on the Donoghue Solicitors blog.