FaceApp Predicted Different Genders for Similar User Photos with Slight Variations

December 24, 2020

A user reported that FaceApp's algorithm predicted different genders for two nearly identical photos that differed only in eyebrow thickness. Incidents like this underscore the need for trustworthy, safe, and secure AI practices.

For those interested in shaping the future of AI governance and ensuring responsible AI practices through Project Cerebellum, please JOIN US.
This incident invites us to Map such occurrences within the HISPI Project Cerebellum TAIM framework to improve its Measure and Manage capabilities.

Matched TAIM controls

Suggested mapping generated from embedding similarity between the incident description and control text (not a formal assessment). Browse all TAIM controls
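The embedding-similarity mapping can be sketched as follows: embed the incident description and each control description, then rank controls by cosine similarity. This is a minimal illustration with toy vectors standing in for real sentence-embedding output; the control IDs and scoring function are hypothetical, not the actual Project Cerebellum pipeline.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_controls(incident_vec, control_vecs):
    # Rank control IDs by descending similarity to the incident embedding.
    scores = {cid: cosine_similarity(incident_vec, v)
              for cid, v in control_vecs.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy embeddings; a real system would use a sentence-embedding model.
incident = np.array([0.9, 0.1, 0.0])
controls = {
    "TAIM-MEASURE-01": np.array([0.8, 0.2, 0.1]),  # hypothetical control IDs
    "TAIM-MANAGE-03":  np.array([0.1, 0.9, 0.3]),
}

ranked = rank_controls(incident, controls)
print(ranked[0][0])  # top-matched control
```

In practice the top-ranked controls would be shown to a reviewer rather than treated as a formal assessment, which is why the page labels the mapping as suggested.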

Alleged deployer
faceapp
Alleged developer
faceapp
Alleged harmed parties
faceapp-non-binary-presenting-users, faceapp-transgender-users, faceapp-users

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/273


When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.

Pre-print on arXiv · Database snapshots & citation guide

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.