Facial recognition continues despite concern from MPs and ongoing court case

By Derek du Preez, August 13, 2019
Summary:
A number of facial recognition use cases continue to pop up across the UK, despite ongoing protests from MPs and civil liberty groups.


Despite express concern from MPs on the influential Science and Technology Committee and strong criticism from civil liberty groups, a number of organisations have confirmed their ongoing use of facial recognition technology over the past week. 

MPs on the Committee recently went so far as to say that all facial recognition trials should be suspended, given growing evidence from respected, independent bodies calling the legal basis for its use into question. 

Facial recognition technology enables the one-to-many matching of near real-time video images of individuals with a curated watchlist of facial images. It has been trialled by the Metropolitan Police and South Wales Police, in particular. 
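To illustrate the one-to-many matching described above, here is a minimal, hypothetical Python sketch: a probe image's face embedding is compared against a watchlist of enrolled embeddings, and a match is only declared above a similarity threshold. The embedding source, the threshold, and the data are illustrative assumptions, not details of the systems used by the Metropolitan Police or South Wales Police.

```python
# Hypothetical sketch of one-to-many face matching against a watchlist.
# The embeddings, threshold, and identifiers below are illustrative only;
# they do not describe any police force's actual system.
import numpy as np


def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Compare one probe embedding with every watchlist entry (one-to-many).

    Returns (best_id, score) if the best score clears the threshold,
    otherwise (None, score) to indicate no match.
    """
    best_id, best_score = None, -1.0
    for person_id, enrolled in watchlist.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score


# Example usage with random vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(3)}
probe = watchlist["person_1"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(match_against_watchlist(probe, watchlist))
```

In a real deployment the embeddings come from a trained face recognition model and the threshold is tuned to trade off false matches against misses, which is exactly where the accuracy and bias concerns raised later in this piece arise.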

The latter said this week that it would carry out a new trial, placing an app on 50 officers' phones that can perform facial profile matching. 

Deputy Chief Constable Richard Lewis said: 

This new app means that, with a single photo, officers can easily and quickly answer the question of ‘Are you really the person we are looking for?’ When dealing with a person of interest during their patrols in our communities, officers will be able to access instant, actionable data, allowing them to identify whether the person stopped is, or is not, the person they need to speak to, without having to return to a police station.

Officers will also be able to verify the identity of a vulnerable person in seconds, rather than hours, resulting in less time digging for information and more time keeping the peace.

I want to stress that our police officers will only be using the new technology in instances where it is both necessary and proportionate to do so and always with the end goal of keeping that particular individual, or the wider public, safe. We have given additional training to the officers who are part of the trial and will closely monitor the use of the app to assess its effectiveness.

However, advocacy group Liberty this week condemned the announcement, pointing to an ongoing court case brought by one of its clients, Ed Bridges, which is yet to be ruled on. Bridges believes his face was scanned by South Wales Police at both a peaceful anti-arms protest and while doing his Christmas shopping. 

Liberty argues that facial scans capture biometric data, making them akin to taking someone’s DNA or fingerprints without consent. The group adds that there is no legal framework governing the use of the technology and that it has been shown to discriminate against women and BAME people. 

This week, Hannah Couchman, Policy and Campaigns Officer at Liberty, said: 

It is shameful that South Wales Police are rolling out portable facial recognition technology to individual officers while their so-called ‘pilots’ are being challenged by Liberty in court. This technology destroys our anonymity in public spaces, chilling our ability to take part in protests and increasing state control over every one of us.

Far less intrusive means have been used for decades by police to establish a person’s identity where necessary. It’s a gross abuse of power for South Wales Police to roll out routine, on-the-spot biometric checks, and especially in circumstances where a person isn’t suspected of committing any crime at all. This technology is intrusive, unnecessary, and has no place on our streets.

Other examples

It was also revealed this week that the developer responsible for a 67-acre site in the King’s Cross area of London has been using facial recognition technology without informing the local council. 

As first reported by the Financial Times, a spokesperson for developer Argent said that the tool was used to “ensure public safety” and was one of a “number of detection and tracking methods”. 

The BBC has also confirmed that London’s Canary Wharf is seeking to trial facial recognition tools. 

Under GDPR, firms must have a clear need to record and use people’s images. In a recent blog post on live facial recognition (LFR), the Information Commissioner said: 

I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR.

Although data protection law differs for commercial companies using LFR, the technology is the same and the intrusion that can arise could still have a detrimental effect. In recent months we have widened our focus to consider the use of LFR in public spaces by private sector organisations, including where they are partnering with police forces. We’ll consider taking regulatory action where we find non-compliance with the law.

The Chair of the Science and Technology Committee, Norman Lamb MP, also recently urged the government to legislate for the proper use of facial recognition technology. He said: 

The legal basis for automatic facial recognition has been called into question, yet the Government has not accepted that there’s a problem. It must. A legislative framework on the use of these technologies is urgently needed. Current trials should be stopped and no further trials should take place until the right legal framework is in place.