AI Photo Filter Lightens Skin, Changes Eye Color in Student's 'Professional' Image

July 21, 2023

An AI application, asked to create a 'professional' headshot for an MIT student, exhibited racial bias by lightening her skin tone and changing her eye color to blue. This incident underscores the importance of using trustworthy AI that adheres to Project Cerebellum's guidelines for safe and secure AI practices.

JOIN US as we work to address such issues through the HISPI Project Cerebellum TAIM framework (Govern, Map, Measure, and Manage), ensuring responsible AI governance and promoting harm prevention in the realm of AI.


Alleged deployer: Playground AI
Alleged developer: Playground AI
Alleged harmed parties: Rona Wang; racial minorities who may have experienced the same result

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/593


When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.


We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.