In May 2022, the Information Commissioner’s Office (ICO) fined Clearview AI Inc. over £7.5 million for illegally scraping images of people from the internet. But despite the ICO’s fine, those affected by the data breach will not receive a penny.
Clearview AI is an American facial recognition company that gathers facial images and personal data from publicly available information on the internet and social media. The company claims to have the largest known database of facial images, with some figures estimating that it has collected more than 20 billion images.
Clearview scraped the internet to find images, including those of UK residents, and added these to its global facial recognition database without people's knowledge or permission.
There are serious privacy concerns over facial recognition software in the UK, where the processing of biometric data (such as images of a person’s face) is permitted only in strictly limited circumstances and, in most cases, requires explicit consent. As such, we believe that Clearview breached UK data protection laws.
If your image has been publicly available online at any time since 2009, including on social media, Clearview very likely breached your data protection rights.
In July 2020, the UK’s Information Commissioner’s Office (ICO) and the Office of the Australian Information Commissioner (OAIC) began a joint investigation into the personal information handling practices of Clearview AI Inc.
The joint investigation was launched after concerns were raised about Clearview’s data collection methods. The company was accused of scraping the images of people from the internet – including social media – and adding these to its database without seeking permission to do so.
Clearview sells access to this data – including to law enforcement agencies – which use its facial recognition algorithm to find matches and track possible suspects.
However, using data in this way is highly controversial. In an investigation lasting over 15 months, the ICO and the OAIC worked together to examine how the technology was being used and whether any formal regulatory action was needed.
The OAIC released its ruling in November 2021, stating that the firm had breached Australians’ privacy.
The main issue was that the data processing was not transparent. As a result, the regulator ordered the company to stop collecting the photos and delete all the pictures of Australian citizens.
In May 2022, the ICO followed suit, fining Clearview AI Inc. over £7.5 million for illegally scraping images of people from the internet.
The ICO found that Clearview AI breached UK data protection laws by failing to:
- use the information of people in the UK in a way that is fair and transparent;
- have a lawful reason for collecting people’s information;
- have a process in place to stop the data being retained indefinitely; and
- meet the higher data protection standards required for biometric data, which is classed as special category data under the UK GDPR.
The ICO also found that the company asked for additional personal information, including photos, when members of the public asked whether they were on its database.
As well as the fine, the ICO issued an enforcement notice ordering the company to stop obtaining and using the personal data of UK residents that is available in the public domain, and to delete the data of UK residents from its systems.
But those affected by the data breach did not receive a penny.
Clearview allows its customers – including the police and businesses – to upload a photograph of someone and try to identify them. It does this by matching the image to those already held in its database.
But there are several ethical and privacy concerns over facial recognition software – especially when it comes to profiling and automated decision-making about individuals. In particular, women and people from BAME groups face discrimination from companies that use such technology, because leading facial-recognition software packages are much worse at identifying women and people of colour than at identifying white, male faces. Big Brother Watch – a British civil liberties and privacy campaigning organisation – described how one black schoolboy was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.
The UK Information Commissioner said that Clearview “not only enables identification” of the people in its database “but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable.”
KP Law has some of the most skilled data breach lawyers in England and Wales.
KP Law is a founding member of the Collective Redress Lawyers Association (CORLA). CORLA aims to improve access to justice for claimants by way of collective redress.