Harmful Stereotyping of Non-Cisgendered People via Text-to-Image Systems
July 3, 2023
Text-to-image systems such as DALL-E allegedly generate biased and often insulting representations of non-cisgender identities. When prompted with gender-identity terms like "trans," "nonbinary," or "queer," the systems tend to produce stereotypical and sexualized images, pointing to systemic bias.
- Alleged deployer
  - DALL-E
- Alleged developer
  - OpenAI
- Alleged harmed parties
  - Non-cisgender individuals, LGBTQ+ community
Source
Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/579
Data source
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
Pre-print on arXiv · Database snapshots & citation guide
We use weekly snapshots of the AIID for stable reference. For the officially suggested citation of a specific incident, use the "Cite this incident" link on that incident's page.