Glasgow Man Allegedly Used AI Tool to Create and Share Non-Consensual Deepfake Nude Images of Former Classmate
February 1, 2024
In February 2024, Callum Brooks, a Glasgow man, reportedly used an AI-powered image-alteration tool to create deepfake nude images of a former classmate by modifying photos she had posted publicly on social media. He allegedly sent the fabricated intimate images to two friends without her consent. The woman was reportedly embarrassed and distressed after learning of the images. Brooks pleaded guilty at Glasgow Sheriff Court and was fined £335.
- Alleged deployer: callum-brooks
- Alleged developer: unknown-deepfake-technology-developers, unknown-image-generator-developer
- Alleged harmed parties: former-classmate-of-callum-brooks, epistemic-integrity
Data source
Incident data is from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/1292
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.