AI-Generated Fake Audio of British Labour Leader Keir Starmer Verbally Abusing Staff Circulates

October 8, 2023

An instance of AI misuse came to light when a deepfake audio clip, falsely portraying UK opposition leader Keir Starmer verbally abusing staff, circulated on social media. Investigators identified and debunked the fabricated audio, which included deceptive background noise to enhance its plausibility. The incident underscores the importance of robust AI governance, especially because such fabrications can erode public trust in AI-driven technologies.

Join us at Project Cerebellum, where we work to foster safe and secure AI practices through the HISPI Project Cerebellum TAIM (Govern) efforts. Be part of the solution—help us Map, Measure, and Manage incidents like this one for a more trustworthy future in AI.

Matched TAIM controls

Suggested mapping derived from embedding similarity (not a formal assessment). Browse all TAIM controls.

Alleged deployer
unknown
Alleged developer
unknown
Alleged harmed parties
UK Labour Party, Keir Starmer

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/601


When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.

Pre-print on arXiv · Database snapshots & citation guide

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.