
Settlement Awarded in Uber Eats Facial Recognition Bias Case

Posted by Emma on 26th Mar 2024

An Uber Eats driver who was subjected to racially biased facial-recognition technology has secured a settlement. Pa Edrissa Manjang's experience with the app's Microsoft-powered verification system led to his wrongful account suspension in 2021 due to "continued mismatches". The incident highlights the risks of deploying artificial intelligence in the workplace without adequate safeguards.

Image: Uber Eats delivery cyclist riding through a busy Oxford Road in Manchester. shopblocks, CC BY 2.0, via Wikimedia Commons

When Manjang registered in November 2019, the app rarely asked him for selfie verifications. As the app's security measures intensified, however, these requests became more frequent, culminating in the suspension of his account on the grounds of safeguarding all app users. Uber stated, "Our real-time ID check is designed to help keep everyone who uses our app safe and includes robust human review."

Supported by the Equality and Human Rights Commission (EHRC) and the App Drivers and Couriers Union (ADCU), Manjang's case raised significant concerns about workers being deprived of income by AI-driven processes deemed racially discriminatory. The ADCU emphasised the urgency of protecting workers' rights amid rapidly advancing AI technologies, denouncing the excessive verification requests as racial harassment.


Reinstated and currently working for Uber Eats in Oxfordshire, Manjang reflects on the settlement as a resolution to a challenging chapter in his life. His ordeal casts a critical light on the potential pitfalls of AI, especially for gig economy workers at the lower end of the pay scale, and strengthens the case for improved rights and protections against AI bias.

Echoing Manjang's concerns, Baroness Falkner of the EHRC criticised the lack of transparency and accountability in the processes that affected Manjang's employment, and highlighted the need for clear mechanisms to challenge and understand such AI-driven decisions. Microsoft has acknowledged the limitations of its facial recognition technology, particularly for ethnic minorities, reinforcing the need for vigilance and correction when deploying these technologies across various sectors.

Manjang's case not only underscores the challenges faced by ethnic minorities in the gig economy but also champions the cause of equitable AI in the workplace. It reminds us of the ongoing need to scrutinise and refine AI and machine learning technologies to prevent discrimination and ensure fairness for all workers.

We encourage our readers to share their thoughts and comments below on the implications of this case for the future of AI in the workforce and the protections needed for gig economy workers.
