Clearview AI – Data Protection Enforcement

The Information Commissioner’s Office (ICO) has fined the US company Clearview AI Inc £7,552,800 for using images of people in the UK and elsewhere that were collected from the web and social media to create a global online database that could be used for facial recognition. This raises important questions about both the jurisdiction of the UK GDPR and the practical implications of extra-territorial enforcement.

Who are Clearview AI?

Clearview AI is an American company headquartered in New York that boasts one of the world’s largest databases of people’s faces. Clearview’s service allows customers, including government and public entities such as the police, to upload an image of a person to the Clearview AI app, which is then checked for a match against all the images contained in Clearview AI’s database. The app then provides a list of images that have similar characteristics to the photo provided by the customer, with links to the websites from which the images came.

Although Clearview AI’s services have primarily been used by law enforcement agencies, the tool is also utilised for intelligence purposes: Clearview AI has offered its services to the Government of Ukraine to assist in the identification of combatants and the deceased on both sides of the conflict.

Why were they fined?

After an investigation, it was determined that Clearview AI Inc had collected more than 20 billion images of people’s faces and associated data from publicly available information on the internet and social media platforms all over the world to create an online database. The data subjects (the individuals whose personal data was being processed) were not informed that their images were being collected or used in this way. Given the high number of UK internet and social media users, Clearview AI Inc’s database was likely to include a substantial amount of data relating to UK residents, gathered without their knowledge.

Although Clearview AI does not have any presence in the UK and no longer offers its services to UK organisations, the company has customers in other countries and was found to still be using the personal data of UK residents. While Clearview does not have an establishment in the EU, it was determined that the company was subject to the UK General Data Protection Regulation (UK GDPR) because it monitors data subjects’ behaviour on the web (Article 3(2)(b) GDPR), as it “does not offer a snapshot [of individuals], but evidently also archives sources over a period of time”.

Although Clearview AI refused to disclose how many of the 20 billion images in its database relate to UK residents, it was determined that Clearview AI takes no steps to exclude UK residents from its database and that it was therefore “inevitable” a large number of images would be from the UK.

Details of the contraventions

The ICO clearly set out its conclusions on the breaches of the UK GDPR/GDPR, and Clearview AI did not even attempt to argue that it was compliant with the legislation. The ICO suggested that it would have been “hopeless” for Clearview to argue compliance, as it found that the company had breached the legislation in a number of ways:

  • Fair and lawful (Art 5(1)(a)): Individuals would not be aware that this processing had taken place. In particular, the images may not have been uploaded by the individuals themselves, so there can be no assumption that they consented to the images being made public. It was also found that images remain in Clearview AI’s database even after they are removed from the internet.
  • Retention (Art 5(1)(e)): Clearview AI did not have a retention policy and did not appear to have any process to delete old data.
  • Lawful basis and special category personal data (Arts 6 and 9(1)): Facial biometrics constitute “special category data”, so they can only be processed if a condition in Art 9 is satisfied. There was also no lawful basis for the processing under Art 6.
  • Transparency (Art 14): The individuals whose images were obtained were not informed of the processing.
  • Rights (Arts 15–22): To have an image removed, individuals had to submit a photograph, which the Information Commissioner considered a “significant fetter” on that right. In any event, Clearview AI had ceased to delete images on request.
  • DPIA (Art 35): No data protection impact assessment had been carried out.

John Edwards, UK Information Commissioner, said:

“…The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.”

“…People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement…”

UK GDPR Enforcement

In consideration of the above, the ICO issued a Monetary Penalty Notice fining Clearview AI £7,552,800.

Although this is one of the largest fines the ICO has ever imposed, it is surprisingly low when considering the number and impact of the breaches. Clearview AI appears to have not only failed to comply with the GDPR but disregarded it entirely. The ICO originally proposed a fine of £17.5m; however, no reason has been given for the significant discount.

In addition to the monetary enforcement, the ICO also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.

Other Enforcement

The UK was not the first to identify Clearview AI’s failure to comply with domestic and international data protection legislation. Clearview has been investigated by a number of other Data Protection Authorities (DPAs), listed below.


In February 2021, the Swedish DPA (IMY) determined that Clearview AI had been used by the local police on a number of occasions. The IMY held that the Police had unlawfully processed biometric data for facial recognition and had failed to conduct a legally required DPIA. The IMY therefore imposed an administrative fine of SEK 2,500,000 (approximately EUR 250,000) on the Police Authority.[1]


This year, the Finnish DPA announced that it had determined that the Finnish National Police Board had also unlawfully processed biometric data of potential victims of child sexual abuse through a trial use of Clearview AI’s automated facial recognition technology. The DPA therefore ordered the National Police Board to notify the data subjects whose identity was known of the data breach and to request that Clearview AI remove police-transmitted data from its systems[2].


The DPA from the German State of Hamburg (HmbBfDI) ruled that the processing of biometric data collected and made available as a service by Clearview AI was unlawful, given the lack of a valid legal basis for the processing of such data. It observed that Clearview AI processes data subjects’ biometric data (under Article 4(14) GDPR), as it “uses a specially developed mathematical procedure to generate a unique hash value of the data subject which enables identification.” The investigation and subsequent decision were triggered by a data subject complaint, which was based on the fact that he had not provided consent for the processing of his biometric data. The DPA determined that Clearview AI, even though it does not have an establishment in the EU, was subject to the GDPR by virtue of the monitoring of data subjects’ activity on the web (Article 3(2)(b) GDPR). Therefore, the DPA ordered Clearview AI to delete all of the complainant’s personal data[3].


The Commission nationale de l’informatique et des libertés (CNIL) has also ordered Clearview AI to stop collecting facial images of persons in France from the internet to feed the company’s database that trains its facial recognition software, and to delete the previously collected images, both within two months. This was due to the unlawful nature of the processing, given that there was no appropriate legal basis under Articles 6(1) and 9(2) of the GDPR for the collection and use of biometric data. To establish its competence over the processing operations carried out by Clearview AI, the CNIL used the same criterion under Article 3(2)(b) GDPR as the HmbBfDI to determine the extraterritorial application of the GDPR, combined with the fact that the controller did not have a lead DPA in the EU as per Article 56(1) GDPR[4].


More recently, the Italian Garante reached similar conclusions and imposed similar corrective measures in its decision to fine Clearview AI a total of EUR 20,000,000. The DPA grounded its decision on the lack of a proper legal basis (legitimate interests did not qualify), as well as on failures to comply with transparency requirements for biometric data processing and the monitoring of persons in Italian territory. Furthermore, Clearview AI was found to be in breach of the purpose and storage limitation principles. Like the CNIL and the ICO, the Garante ordered the company to delete the illegally collected personal data and prohibited it from further collecting and processing information about Italian residents through web scraping techniques and its facial recognition system. The Garante also ordered the controller to designate a representative in the EU to be addressed in addition to or instead of the US-based controller, as the company did not have an establishment in the EU but was subject to the GDPR via both Article 3(2) targeting criteria.[5]

What next?

Clearview AI is yet to publicly respond to the enforcement. Enforcement action has also been brought outside the EU, including cases in Australia and the US, after an exposé of Clearview’s practices was published in the New York Times. It has been suggested that Clearview disputes the EU DPAs’ jurisdiction and may simply ignore the penalties and enforcement notices. Clearview AI was due to pay the ICO’s fine by 17 June; however, little has been published online to determine whether it has done so.

Another factor to consider is that it is arguably impossible to determine who is a UK resident, making it unlikely that Clearview AI will be able to fully comply with the enforcement notice. Mitigating measures such as constructing a ‘geofence’ or reviewing facial vectors to ensure no metadata is associated with the UK may reduce the likelihood of UK personal data being captured; however, they do not give a clear-cut answer as to which data relates to UK residents.

What does this mean for your business?

  1. If you are using external software (or data-gathering capability), ensure that you as a company understand the entire data flow. As the cases above show, the various Data Protection Authorities in some cases sanctioned the users of the software, not just Clearview.
  2. Ensure you conduct a Data Protection Impact Assessment (DPIA) and regularly review and update it. Producing a DPIA is an ongoing requirement if you continue to gather data, not a one-off exercise. As part of the DPIA, you should also consider what data you are holding and whether you should keep or delete it after a period of time.
  3. Combining data such as names with images can turn data into personally identifiable information, bringing your business activities within the scope of the GDPR. Ensure you understand the distinctions.
  4. Regardless of where in the world you are based, you may still be within the remit of UK GDPR/EU GDPR if you are processing data of UK and European citizens or residents.

The case is certainly significant in the world of data protection and should encourage international companies that handle UK and EU citizens’ personal data to review their practices and their compliance with data protection legislation.

[1] IMY, Beslut efter tillsyn enligt brottsdatalagen – Polismyndighetens användning av Clearview AI, DI-20202719, A126.614/2020, February 10, 2021, available at beslut-tillsyn-polismyndigheten-cvai.pdf.

[2] See also Office of the Data Protection Ombudsman, Poliisille huomautus henkilötietojen lainvastaisesta käsittelystä kasvojentunnistusohjelmalla, September 28, 2021, available at

[3] HmbBfDI, Az.: 545/2020; 32.02-102, January 27, 2021, available at files/2021-01/545_2020_Anh%C3%B6rung_CVAI_ENG_Redacted.PDF.

[4] Commission nationale de l’informatique et des libertés, Decision n° MED 2021-134 of 1st November 2021 issuing an order to comply to the company CLEARVIEW AI, available at files/decision_ndeg_med_2021-134.pdf.

[5] Garante, Ordinanza ingiunzione nei confronti di Clearview AI — 10 febbraio 2022 [9751362], available at https://