
Uber's use of facial identification software can lead to breaches of data protection legislation

In early 2021, Uber hit the headlines after it lost a landmark case relating to workers' rights. At Keller Postman UK, we are helping Uber drivers to claim compensation after the Supreme Court ruled that Uber drivers are workers, not self-employed contractors, and should therefore be entitled to the National Minimum Wage and holiday pay. As a result, Uber may have to pay its current and former drivers thousands of pounds in back pay. You can find out more about this case here.

But it does not look like Uber will be out of the news any time soon, as the company is now facing another legal challenge. This time, the issue concerns BAME drivers who claim that Uber's facial identification software is discriminating against them. If shown to be true, this is not only a workers' rights issue but also a clear breach of the General Data Protection Regulation (GDPR).

Uber's technology is erroneously firing BAME drivers

Drivers working for Uber and Uber Eats must use facial identification software to access the Uber system. The tool is designed to verify that driver accounts are not being used by anyone other than the licensed individual.

To set it up, drivers must submit a photograph of themselves, which is checked against their driver's licence. Once registered, drivers are regularly prompted to take real-time selfies for verification. These selfies are then matched against the account holder's profile picture. If a driver fails the ID check, their account is immediately suspended.
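Uber has not published the details of how its verification works, but a typical face-verification pipeline reduces each photo to a numerical "embedding" and compares the two. The minimal sketch below (in Python, with entirely hypothetical names and thresholds, not Uber's actual system) shows why a model that embeds some faces less reliably than others produces more false rejections for those drivers, even though the decision rule itself looks neutral:

```python
# Illustrative sketch only: Uber's actual verification logic is not public.
# A typical face-verification system turns each photo into an embedding
# vector and passes the check if the two vectors are similar enough.
import numpy as np

MATCH_THRESHOLD = 0.85  # hypothetical operating point, not Uber's


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify_selfie(profile_embedding: np.ndarray,
                  selfie_embedding: np.ndarray) -> bool:
    """Automated pass/fail decision, with no human in the loop."""
    return cosine_similarity(profile_embedding, selfie_embedding) >= MATCH_THRESHOLD


# If the embedding model was trained mostly on lighter-skinned faces, the
# embeddings it produces for darker-skinned drivers are noisier, so the
# same fixed threshold yields more false rejections for those drivers
# and, in a fully automated pipeline, more wrongful suspensions.
profile = np.random.rand(128)                       # stand-in stored profile embedding
selfie = profile + np.random.normal(0, 0.2, 128)    # noisy real-time selfie embedding
if not verify_selfie(profile, selfie):
    print("ID check failed: account suspended automatically")
else:
    print("ID check passed")
```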

Some BAME drivers claim that this technology is costing them their livelihoods because the software is incapable of recognising their faces; something as simple as growing or shaving a beard can lead to a failed check. After a failed ID check, some drivers have been threatened with termination, had their accounts frozen (leaving them unable to work), or even been permanently dismissed. And, while Uber claims that failed verifications are checked manually by a human, the drivers allege that the process is entirely automated and that they are left without any right of appeal.

To make a bad situation worse, we have heard of cases where a driver who failed an ID verification lost his job, his Uber licence, and his private licence.

The problem with facial recognition software

There are several ethical and privacy concerns around the use of facial recognition software. One of the biggest is poor performance at identifying people with darker skin.

In 2018, an influential paper found that leading facial-recognition packages were much worse at identifying the gender of women and people of colour than at classifying white, male faces. Software similar to that used by Uber had a 20.8% failure rate for darker-skinned female faces, 6% for darker-skinned males, and 0% for white men.

Big Brother Watch – a British civil liberties and privacy campaigning organisation – described how one black schoolboy was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.

And in 2019, a black driver in the US sued Uber after he was fired when the software did not recognise him in "pitch darkness", forcing him to lighten his selfies. Since then, concerns over racial and gender bias have not disappeared.

Facial recognition software and GDPR

The GDPR introduced strict personal data protection and privacy rights for individuals. With significant developments in facial recognition software and more and more applications hitting the market, questions of data privacy and protection regarding our faces will not go away any time soon, especially as facial images are prone to misuse such as hacking and identity theft. But, as Uber has shown, it is not just data privacy we need to worry about.

Article 22 of the GDPR concerns “Automated individual decision-making, including profiling”. And it is here that Uber is likely breaking the law.

Under this provision, individuals "have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her".

While this does not apply where processing is "necessary for entering into, or performance of, a contract between the data subject" (the Uber driver) "and a data controller" (Uber), the existence of a contract is not a sufficient basis for decisions that negatively affect an individual where special categories of personal data are involved. Under the GDPR, biometric data, such as facial images, constitutes a "special" category of personal data.

In short, the processing of biometric data, and the use of automated individual decision-making (including profiling), are only justified in narrowly defined circumstances. If Uber's technology discriminates against BAME drivers and automatically makes decisions that harm them, it is not GDPR compliant.
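To make the Article 22 point concrete, here is a minimal sketch (again with hypothetical names, not any real Uber code) contrasting the fully automated suspension the drivers describe with the kind of human-in-the-loop safeguard Article 22 contemplates for decisions with "legal or similarly significant effects":

```python
# Illustrative sketch of the Article 22 distinction: it is the policy
# applied after a failed check, not the face-matching itself, that
# determines whether the decision is "based solely on automated processing".
from dataclasses import dataclass


@dataclass
class CheckResult:
    driver_id: str
    passed: bool


def escalate_to_human_review(driver_id: str) -> str:
    # Placeholder: in a real system this would open a case for a trained
    # reviewer and notify the driver of their right to contest the decision.
    return f"review queued for driver {driver_id}"


def handle_check(result: CheckResult) -> str:
    """Decide what happens after an automated ID check."""
    if result.passed:
        return "continue"
    # Pattern the drivers allege (problematic under Article 22): the
    # algorithm's verdict alone ends the driver's ability to work.
    #   return suspend_account(result.driver_id)
    #
    # Article 22-style safeguard: a human reviews the flagged check
    # before any decision with significant effects is taken.
    return escalate_to_human_review(result.driver_id)


print(handle_check(CheckResult(driver_id="D123", passed=False)))
```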

Taking legal action against Uber

We are supporting Uber drivers in England & Wales who have GDPR concerns over Uber’s facial recognition software, algorithmic accountability, and automated decision-making processes.

To take action, register with us in confidence and tell us about your experience. We act on a no-win, no-fee basis, so you have nothing to lose.

Contact us to discuss the Uber GDPR violation.

Published by Keller Postman
