Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias
September 18, 2020
Matched TAIM controls
Suggested mapping from embedding similarity (not a formal assessment).
- MEASURE 2.10 — similarity 0.624, rank 1
- MAP 1.6 — similarity 0.620, rank 2
- GOVERN 3.1 — similarity 0.615, rank 3
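The page does not say how these similarity scores are produced; a common approach for this kind of suggested mapping is cosine similarity between a text embedding of the incident description and embeddings of each control's text, ranked highest first. A minimal sketch under that assumption (the vectors and control names here are placeholders, not the actual embeddings used):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_controls(incident_vec, control_vecs, top_k=3):
    """Score every control against the incident embedding and
    return the top_k (name, similarity) pairs, highest first."""
    scored = [
        (name, cosine_similarity(incident_vec, vec))
        for name, vec in control_vecs.items()
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy 2-D embeddings standing in for real incident/control vectors.
incident = [1.0, 0.0]
controls = {
    "MEASURE 2.10": [0.9, 0.1],
    "MAP 1.6": [0.5, 0.5],
    "GOVERN 3.1": [0.0, 1.0],
}
ranking = rank_controls(incident, controls)
```

With these toy vectors, "MEASURE 2.10" ranks first because its vector points closest to the incident's; the real scores above would come from much higher-dimensional embeddings.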
- Alleged deployer: Twitter
- Alleged developer: Twitter
- Alleged harmed parties: twitter-users, twitter-non-white-users, twitter-non-male-users
Source
Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/103
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.