AI-Altered Fake Nude Images of High School Girls and Women Reportedly Created and Disseminated in Pensacola, Florida
October 10, 2024
In Pensacola, Florida, an 18-year-old man allegedly used an online AI image-alteration application to digitally "undress" photos of dozens of girls and young women without their consent, creating realistic fake nude images. Some of the source photos were reportedly taken when the victims were minors. The images were reportedly discovered on the man's phone and subsequently copied and shared with other students by a third party, prompting a police investigation and his arrest.
- Alleged deployer: unnamed-18-year-old-male-student-from-pensacola
- Alleged developer: unknown-image-generator-developers, unknown-deepfake-technology-developers
- Alleged harmed parties: unnamed-students-from-pensacola, minors, students, epistemic-integrity
Source
Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/1354
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.