1 in 6 Congresswomen Have Reportedly Been Targeted by AI-Generated Nonconsensual Intimate Imagery
December 11, 2024
A study by the American Sunlight Project reportedly found that 1 in 6 Congresswomen have been targeted by AI-generated nonconsensual intimate imagery (NCII) shared on deepfake websites. The study identified 35,000 mentions of explicit content depicting 26 members of Congress, 25 of them women. According to the report, women were 70 times more likely than men to be victimized.
- Alleged deployer: unknown-deepfake-creators
- Alleged developer: unknown-deepfake-technology-developers
- Alleged harmed parties: congresswomen
Source
Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/874
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
Weekly snapshots of the AIID are used for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on that incident's page.