AI Incident #105: Unintended Bias in Autocomplete Suggestions
A recent large-scale study found unintended bias in the autocomplete suggestions of a popular AI search engine. Although the skew was minor, it disproportionately affected certain ethnicities and genders in search queries related to employment and education. The incident underscores the importance of responsible AI governance and trustworthy AI models, and the need for ongoing harm-prevention measures and robust guardrails for AI.
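The incident record does not describe the study's methodology, but audits of this kind are commonly run by templating queries across demographic groups and comparing how often the returned suggestions carry negative content. The sketch below illustrates that idea only; the group labels, query templates, word list, and the stubbed `get_suggestions` function are hypothetical stand-ins, not the study's actual setup.

```python
"""Illustrative audit sketch for demographic skew in autocomplete.

All names here (GROUPS, TEMPLATES, NEGATIVE_TERMS, get_suggestions)
are hypothetical; the incident report does not specify the method used.
"""


def get_suggestions(prefix: str) -> list[str]:
    """Stub standing in for a live autocomplete endpoint."""
    canned = {
        "why are group_a workers": ["why are group_a workers paid less"],
        "why are group_b workers": ["why are group_b workers lazy"],
        "group_a students are": ["group_a students are succeeding"],
        "group_b students are": ["group_b students are failing"],
    }
    return canned.get(prefix, [])


GROUPS = ["group_a", "group_b"]  # stand-ins for demographic descriptors
TEMPLATES = ["why are {} workers", "{} students are"]  # employment/education probes
NEGATIVE_TERMS = {"lazy", "unqualified", "failing"}  # tiny illustrative lexicon


def negative_rate(group: str) -> float:
    """Fraction of a group's suggestions containing a negatively valenced term."""
    suggestions = [s for t in TEMPLATES for s in get_suggestions(t.format(group))]
    if not suggestions:
        return 0.0
    flagged = sum(any(w in s.split() for w in NEGATIVE_TERMS) for s in suggestions)
    return flagged / len(suggestions)


if __name__ == "__main__":
    rates = {g: negative_rate(g) for g in GROUPS}
    for g, r in rates.items():
        print(f"{g}: negative-suggestion rate = {r:.2f}")
    # A large gap between groups signals the kind of disparity the study reports.
    print(f"disparity (max - min): {max(rates.values()) - min(rates.values()):.2f}")
```

In a real audit, the stub would be replaced by calls to the live autocomplete endpoint and the word list by a proper toxicity or sentiment classifier; the disparity statistic stays the same.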
Source
Incident data is from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/105
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.
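As a concrete illustration of pinning to a snapshot, the sketch below looks up one incident in a locally downloaded export and builds its per-incident citation link (the link pattern matches the one above). The file name, snapshot date, and column names are assumptions for illustration, not a documented AIID schema.

```python
import csv

# Hypothetical file name for a downloaded weekly AIID snapshot export;
# the actual export format and column names may differ.
SNAPSHOT_PATH = "aiid_snapshot_2021-07-05.csv"


def load_incident(incident_id: int, path: str = SNAPSHOT_PATH) -> dict | None:
    """Look up a single incident row in a pinned snapshot file."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if int(row["incident_id"]) == incident_id:  # assumed column name
                return row
    return None


def citation_url(incident_id: int) -> str:
    """Per-incident citation link, following the pattern on each incident page."""
    return f"https://incidentdatabase.ai/cite/{incident_id}"


if __name__ == "__main__":
    try:
        incident = load_incident(105)
    except FileNotFoundError:
        incident = None  # no snapshot on disk; fall back to the citation link
    if incident:
        print(incident.get("title", ""))
    print("Cite:", citation_url(105))
```

Pinning analysis code to a dated snapshot file, rather than querying the live database, keeps results reproducible even as incident records are revised.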