Amazon Abandons Bias-Prone Recruitment Tool: A Case for Responsible AI
In an effort to uphold principles of trustworthy AI, Amazon has recently discarded a controversial recruitment tool that exhibited gender bi...
The e-commerce giant Amazon introduced an AI tool for recruiting candidates but was compelled to shut it down due to gender bias. The system...
Recently, Amazon announced the termination of an AI recruitment tool that exhibited gender bias in its hiring process. This move reflects th...
A recent study found that Amazon's AI-powered recruiting tool may have inadvertently disadvantaged female applicants for technical roles. Th...
In a move towards more responsible AI governance, Amazon discontinued the use of an AI recruitment tool following reports of sexist bias. Th...
Recent reports suggest that Amazon has shut down an internal recruiting tool powered by AI, which displayed bias against women candidates. T...
In a move towards promoting safe and secure AI, Amazon has decided to scrap an internal tool due to its sexist bias. The AI model was design...
In a recent development, Amazon abandoned an AI tool used for recruitment due to concerns about gender bias. This incident underscores the n...
Examine how AI, trained on biased data, can perpetuate gender stereotypes. Learn about responsible AI practices for reducing bias and promot...
Delving into recent AI incidents, this article scrutinizes the alarming tendency of technology to mimic human prejudice and malfeasance. The...
Amazon's recruiting tool, designed to streamline the hiring process, was abruptly shut down due to its gender bias against female candidate...
A recent report reveals that Amazon decided to scrap a machine learning tool due to concerns about gender bias in its recommendations. This...
Data source
Incident data is from the AI Incident Database (AIID).
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
Pre-print available on arXiv. See also the AIID database snapshots and citation guide.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.