Why Google 'Thought' This Black Woman Was a Gorilla
In this article, we delve into the infamous Google Photos incident where an image of a Black woman was mistakenly labeled as a gorilla. This...
In a recent incident, Google's image search algorithm mistakenly tagged images of Black people as 'Gorillas.' This unfortunate event undersc...
Recent reports of Google Photos incorrectly identifying Black people as 'gorillas' have once again brought to light the issue of AI bias and...
Recent events have shed light on an unintended racial slur served up by Google's software. The company swiftly recognized the issue, acknowl...
A recent incident involving the Google Photos app has sparked controversy due to its misclassification of individuals based on race. The AI-...
Google, one of the tech giants pioneering AI, recently apologized for an incident where its photo-scanning app labeled images containing Bla...
Google recently took a step towards responsible AI governance by removing the term 'gorillas' from its search function, following reports of...
In a recent development, tech giant Google has rectified an incident where its algorithm categorized certain individuals as 'Gorillas'. The...
Recently, Google acknowledged an incident involving the inappropriate tagging of photos containing African American individuals. This incide...
Recent reports indicate that Google Photos, the popular image organizing platform, continues to misidentify gorillas in user photos. This in...
Google has rectified a two-year-old issue where its image classification algorithm was mislabeling images of Black people as gorillas. The c...
In an unfortunate incident that underscores the need for responsible AI governance, Google's photo application misclassified a Black couple...
Data source
Incident data is from the AI Incident Database (AIID).
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
Pre-print on arXiv · Database snapshots & citation guide
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the "Cite this incident" link on that incident's page.