Purportedly AI-Altered Fake Nude Images of High School Girls and Women Reportedly Created and Disseminated in Pensacola, Florida

October 10, 2024

In Pensacola, Florida, an individual allegedly misused an online AI image-modifying tool to create nonconsensual, realistic nude images of several high school girls and women. The source photos for some victims were taken when they were minors. A third party reportedly found the images on the suspect's phone, copied them, and spread them among other students, prompting a police investigation and an arrest.

This incident underscores the importance of responsible AI governance and of safe, secure practices. Join us in mapping such incidents to HISPI Project Cerebellum TAIM (Govern) to help prevent harm and implement guardrails for trustworthy AI.

Matched TAIM controls

Suggested mapping from embedding similarity (not a formal assessment).
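
The note above says matches come from embedding similarity rather than a formal assessment. As a rough illustration of how such a mapping could work, the sketch below embeds an incident description and a few control summaries, then ranks the controls by cosine similarity. The model name ("all-MiniLM-L6-v2"), the control IDs, and the control texts are illustrative assumptions, not the actual TAIM catalog or pipeline.

```python
# A minimal sketch, assuming an off-the-shelf sentence encoder, of mapping an
# incident description to control texts by embedding similarity. The model
# name, control IDs, and control summaries are illustrative assumptions only.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder choice

incident = (
    "An individual allegedly misused an online AI image-modifying tool to "
    "create nonconsensual, realistic nude images of high school girls and women."
)

# Hypothetical control summaries standing in for real TAIM (Govern) controls.
controls = {
    "GOV-01": "Establish policies restricting harmful or abusive uses of generative AI.",
    "GOV-02": "Implement provenance and synthetic-media detection for generated content.",
    "GOV-03": "Define incident reporting and response procedures for AI misuse.",
}

incident_vec = model.encode(incident, convert_to_tensor=True)
control_vecs = model.encode(list(controls.values()), convert_to_tensor=True)

# Cosine similarity between the incident and each control; higher = closer match.
scores = util.cos_sim(incident_vec, control_vecs)[0]
ranked = sorted(zip(controls, scores), key=lambda pair: float(pair[1]), reverse=True)
for control_id, score in ranked:
    print(f"{control_id}: {float(score):.3f}")
```

A real pipeline would presumably embed the full control catalog and apply a similarity threshold before surfacing matches, which is why results like these are labeled a suggestion rather than a formal assessment.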

Alleged deployer
unnamed-18-year-old-male-student-from-pensacola
Alleged developer
unknown-image-generator-developers, unknown-deepfake-technology-developers
Alleged harmed parties
unnamed-students-from-pensacola, students, minors, epistemic-integrity

Data source

Incident data is from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/1354

When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.


We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.