UK follows Australia in clamping down on US facial recognition firm Clearview AI with £17 million data privacy fine

Stuart Lauchlan, November 30, 2021
Summary:
The UK privacy regulator isn't happy with Clearview AI Inc's operational strategy. Nor is her Australian counterpart. But Clearview AI reckons it's being misunderstood.

The UK has followed in Australia’s footsteps as its privacy regulator imposes a provisional £17 million fine on US facial recognition firm Clearview AI Inc for breaching data protection regulations.

Clearview AI Inc, which pitches itself as the ‘World’s Largest Facial Network’, has been the subject of a joint investigation by the UK’s Information Commissioner’s Office (ICO) and the Office of the Australian Information Commissioner (OAIC). The probe centered on the firm’s use of images scraped from the internet and its use of biometrics for facial recognition. The firm holds a database of over 10 billion images; customers supply a probe image and the firm carries out biometric searches against that database on their behalf.
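For readers unfamiliar with how a probe-image search works in general terms, the sketch below shows the basic mechanics: each scraped photo is reduced to a numeric "embedding" vector, and an uploaded image is compared against every stored embedding to rank the closest matches. This is a toy illustration with invented names, sizes, and random data, assuming only NumPy; it is not a description of Clearview AI's actual implementation.

```python
# Illustrative sketch only: a toy nearest-neighbour search over face
# embeddings, showing the general shape of a probe-image lookup.
# This is NOT Clearview AI's system; all names and numbers are invented.
import numpy as np

EMBED_DIM = 512      # typical face-embedding size (assumption)
DB_SIZE = 100_000    # toy stand-in for a multi-billion-image database

rng = np.random.default_rng(0)

# In a real system each row would be the embedding a face-recognition
# model produced for one scraped image; here they are random vectors,
# normalised so a dot product equals cosine similarity.
database = rng.normal(size=(DB_SIZE, EMBED_DIM)).astype(np.float32)
database /= np.linalg.norm(database, axis=1, keepdims=True)

def search(probe: np.ndarray, k: int = 5) -> list[tuple[int, float]]:
    """Return the k database rows most similar to the probe embedding."""
    probe = probe / np.linalg.norm(probe)
    scores = database @ probe                  # cosine similarity per row
    top = np.argsort(scores)[::-1][:k]         # indices of best matches
    return [(int(i), float(scores[i])) for i in top]

# A customer-side lookup: embed the supplied photo (model step skipped
# here), then rank every stored face by similarity to it.
probe_embedding = rng.normal(size=EMBED_DIM).astype(np.float32)
for idx, score in search(probe_embedding):
    print(f"image #{idx}: similarity {score:.3f}")
```

The privacy questions in the ICO and OAIC rulings attach to the data layer of such a pipeline: where the stored images came from, and whether the people in them ever consented to being searchable.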

The ICO has determined that the database is likely to include the data of “a substantial number” of UK citizens and that this data may have been collected without the knowledge of those citizens. The preliminary ruling stemming from this investigation concludes that Clearview AI Inc appears to be in breach of the UK data protection regime in a number of ways:

  • Failing to process the information of people in the UK in a way they are likely to expect or that is fair;
  • Failing to have a process in place to stop the data being retained indefinitely;
  • Failing to have a lawful reason for collecting the information;
  • Failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
  • Failing to inform people in the UK about what is happening to their data; and
  • Asking for additional personal information, including photos, which may have acted as a disincentive to individuals who wish to object to their data being processed.

Clearview AI Inc has ceased to offer its services in the UK, but Information Commissioner Elizabeth Denham says that the proposed £17 million penalty is necessary:

I have significant concerns that personal data was processed in a way that nobody in the UK will have expected. It is therefore only right that the ICO alerts people to the scale of this potential breach and the proposed action we’re taking. UK data protection legislation does not stop the effective use of technology to fight crime, but to enjoy public trust and confidence in their products technology providers must ensure people’s legal protections are respected and complied with.

Clearview AI Inc’s services are no longer being offered in the UK. However, the evidence we’ve gathered and analysed suggests Clearview AI Inc were and may be continuing to process significant volumes of UK people’s information without their knowledge. We therefore want to assure the UK public that we are considering these alleged breaches and taking them very seriously.

Clearview AI Inc still has an opportunity to appeal the preliminary ruling, and no final decision is likely to come before the middle of next year. In a statement, company lawyer Kelly Hagedorn confirmed that an appeal was under consideration:

The UK ICO Commissioner's assertions are factually and legally incorrect. The company is considering an appeal and further action. Clearview AI provides publicly available information from the internet to law enforcement agencies. To be clear, Clearview AI does not do business in the UK, and does not have any UK customers at this time.

Damned down under

The preliminary ruling from the UK’s ICO comes only weeks after its Australian counterpart found that Clearview AI breached the Australian Privacy Act 1988 by:

  • Collecting Australians’ sensitive information without consent;
  • Collecting personal information by unfair means;
  • Not taking reasonable steps to notify individuals of the collection of personal information;
  • Not taking reasonable steps to ensure that personal information it disclosed was accurate, having regard to the purpose of disclosure;
  • Not taking reasonable steps to implement practices, procedures and systems to ensure compliance with the Australian Privacy Principles.

The OAIC ordered Clearview AI to stop collecting facial images and biometric templates from Australian citizens and to destroy those it had already collected for its database. Australian Information Commissioner and Privacy Commissioner Angelene Falk said:

When Australians use social media or professional networking sites, they don’t expect their facial images to be collected without their consent by a commercial entity to create biometric templates for completely unrelated identification purposes. The indiscriminate scraping of people’s facial images, only a fraction of whom would ever be connected with law enforcement investigations, may adversely impact the personal freedoms of all Australians who perceive themselves to be under surveillance.

She added:

The covert collection of this kind of sensitive information is unreasonably intrusive and unfair. It carries significant risk of harm to individuals, including vulnerable groups such as children and victims of crime, whose images can be searched on Clearview AI’s database.

By its nature, this biometric identity information cannot be reissued or cancelled and may also be replicated and used for identity theft. Individuals featured in the database may also be at risk of misidentification. These practices fall well short of Australians’ expectations for the protection of their personal information.

Misunderstood? 

For its part, Clearview AI pitches itself as a force for societal good. On its website, it says it is “dedicated to innovating and providing the most cutting-edge technology to law enforcement to investigate crimes, enhance public safety and provide justice to victims”. The mission statement goes on:

That's why we developed a revolutionary, web-based intelligence platform for law enforcement to use as a tool to help generate high-quality investigative leads. Our platform, powered by facial recognition technology, includes the largest known database of 10+ billion facial images sourced from public-only web sources, including news media, mugshot websites, public social media, and other open sources.

Our solutions allow agencies to gain intelligence and disrupt crime by revealing leads, insights and relationships to help investigators solve both simple and complex crimes, increase officer and public safety, and keep our communities and families safer.

Founder and CEO Hoan Ton-That clearly isn’t happy with how the UK and Australia have interpreted his firm’s intentions. Indeed, he sounds positively peeved in an emotive statement:

I grew up in Australia and have long viewed the UK as an important, majestic place—one about which I have the deepest respect. I am deeply disappointed that the UK Information Commissioner has misinterpreted my technology and intentions. I created the consequential facial recognition technology known the world over.

My company and I have acted in the best interests of the UK and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts. 

It breaks my heart that Clearview AI has been unable to assist when receiving urgent requests from UK law enforcement agencies seeking to use this technology to investigate cases of severe sexual abuse of children in the UK. 

My take

I have a very strong view on this, but as the UK ruling is still open to appeal, it would be inappropriate to comment on the specifics of the arguments being made by either party. That said, this looks to be a benchmark case that will set an important precedent, one way or another. It also, not for the first time, highlights the difference between US attitudes to this sort of practice and those of much of the rest of the world. Without doubt, we will return to this topic.
