HISPI Project Cerebellum
AI Incidents

Synthetic Voice 'Olesya' by Storm-1516 Falsely Accuses Ukraine in U.S. Election Disinformation Campaign

April 1, 2024

Russian operatives used AI to create a fake video and synthetic voice of "Olesya," a supposed troll in Kyiv who falsely claimed to be interfering in U.S. elections to support President Biden. U.S. intelligence confirmed the voice was AI-generated. The disinformation campaign aimed to mislead voters, erode trust in democratic institutions, and influence the 2024 U.S. election. The incident involved the group Storm-1516, individuals linked to Valery Korovin, and potential veterans of the Internet Research Agency.
Alleged deployer
valery-korovin, storm-1516, internet-research-agency-veterans, center-for-geopolitical-expertise
Alleged developer
valery-korovin, storm-1516, internet-research-agency-veterans, center-for-geopolitical-expertise
Alleged harmed parties
ukrainian-general-public, joe-biden, general-public, democratic-institutions, biden-presidential-campaign, american-conservatives

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/727

Data source

Incident data is from the AI Incident Database (AIID).

When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
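
For reference managers, the same citation can be expressed as a BibTeX entry. This is a hand-built sketch based only on the reference above; the entry key is a placeholder and fields not stated there (such as volume or page numbers) are omitted rather than guessed. It is not an official export from the database.

```bibtex
@inproceedings{mcgregor2021aiid,
  author    = {McGregor, S.},
  title     = {Preventing Repeated Real World {AI} Failures by Cataloging Incidents: The {AI} Incident Database},
  booktitle = {Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21)},
  year      = {2021},
  note      = {Virtual Conference}
}
```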

Pre-print on arXiv · Database snapshots & citation guide

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.