AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

October 24, 2025

NudeNet, a training dataset used to build nudity-detection models, reportedly contained Child Sexual Abuse Material (CSAM), including images of identified victims. The dataset was widely used in academic research before the issue came to light. The images were allegedly included without proper vetting, exposing researchers to legal risk and compounding harm to victims. Upon notification, the dataset was removed.

This incident underscores the need for trustworthy AI governance and safe, secure data practices, and it highlights the role of HISPI Project Cerebellum TAIM (Govern) in preventing such occurrences. If you are interested in shaping responsible AI practices and contributing to a safer digital environment, join us.

Matched TAIM controls

The mapping here is suggested by embedding similarity and is not a formal assessment; a sketch of the technique appears below. Browse all TAIM controls for the full catalog.
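
As a minimal sketch of how such a mapping can be computed (the embedding model, control identifiers, control texts, and similarity threshold below are illustrative assumptions, not the site's actual pipeline):

```python
# Minimal sketch: rank TAIM controls by cosine similarity to an incident summary.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model; any sentence embedder works

incident = (
    "An AI nudity-detection training dataset reportedly contained CSAM, "
    "including images of identified victims, and was used in academic research."
)
# Hypothetical control IDs and descriptions for illustration only.
controls = {
    "GOV-1": "Establish data provenance and vetting procedures for training datasets.",
    "GOV-2": "Require legal and ethical review gates before dataset release.",
    "MAP-3": "Identify affected parties and potential harms of data collection.",
}

# Embed the incident text and every control description, L2-normalized so that
# a plain dot product equals cosine similarity.
emb = model.encode([incident] + list(controls.values()), normalize_embeddings=True)
scores = emb[1:] @ emb[0]  # cosine similarity of each control to the incident

# Report controls above an assumed similarity threshold, best match first.
for (ctrl_id, _), score in sorted(zip(controls.items(), scores), key=lambda x: -x[1]):
    if score >= 0.3:  # threshold chosen for illustration only
        print(f"{ctrl_id}: similarity={score:.2f}")
```

Normalizing the embeddings up front is a common simplification: it turns the ranking step into a single matrix-vector product rather than a per-pair cosine computation.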

Alleged deployer
academic-researchers, research-institutions, ai-developers, dataset-users, independent-researchers, ai-researchers
Alleged developer
nudenet-dataset-maintainers, nudenet-model-developers
Alleged harmed parties
minors, identified-csam-victims, individuals-subjected-to-sexual-exploitation-imagery, academic-researchers

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/1349

When citing the database as a whole, please use:

McGregor, S. (2021). Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.