AI Incident 131: Unintended Bias in Predictive Modeling

An AI system designed to analyze loan applications was found to discriminate against certain demographic groups in its predictive modeling. The model had been trained on historical lending data that already contained biases, and those biases carried through to its predictions, resulting in unfair treatment of some applicants.
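The incident description does not specify how the bias was detected, but a common first check in cases like this is to compare model approval rates across demographic groups (a demographic-parity check). Below is a minimal, hedged sketch on synthetic data, not the actual incident system: all feature names ("income", "group"), the synthetic bias term, and the modeling choice of logistic regression are illustrative assumptions.

```python
# Minimal sketch (synthetic data, NOT the incident system): historical labels
# carry a biased penalty against one group, a model trained on them reproduces
# the disparity, and a demographic-parity check surfaces it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute: two hypothetical demographic groups, 0 and 1.
group = rng.integers(0, 2, size=n)

# A legitimate feature, distributed identically across both groups.
income = rng.normal(50_000, 15_000, size=n)

# Historical approval labels: driven by income, plus a biased penalty
# applied to group 1 -- the bias baked into the training data.
logits = (income - 50_000) / 15_000 - 1.0 * group
historical_approved = rng.random(n) < 1 / (1 + np.exp(-logits))

# Naive pipeline: the protected attribute (or a proxy for it) leaks into
# the model inputs, so the model can learn the historical disparity.
X = np.column_stack([income, group])
model = LogisticRegression().fit(X, historical_approved)
pred = model.predict(X)

# Demographic-parity check: compare predicted approval rates by group.
rate_0 = pred[group == 0].mean()
rate_1 = pred[group == 1].mean()
print(f"approval rate, group 0: {rate_0:.2%}")
print(f"approval rate, group 1: {rate_1:.2%}")
print(f"demographic parity difference: {rate_0 - rate_1:.2%}")
```

Run as written, the two approval rates diverge noticeably even though income is distributed identically across groups, which is the pattern the incident describes: the model is only reproducing the bias present in its training labels.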

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/131


When citing the database as a whole, please use:

McGregor, S. (2021). Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.


We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.