Synthetic Voice 'Olesya' by Storm-1516 Falsely Accuses Ukraine in U.S. Election Disinformation Campaign

April 1, 2024

Russian operatives used AI to fabricate a synthetic video and voice of 'Olesya', a supposed troll in Kyiv who falsely claimed to be interfering in the U.S. elections in support of President Biden. U.S. intelligence confirmed the voice was AI-generated. The disinformation effort targeted voters, undermined trust in democratic institutions, and potentially influenced the 2024 election. The group behind the incident is Storm-1516, which has links to Valery Korovin and suspected veterans of the Internet Research Agency.

JOIN US in promoting trustworthy AI practices and helping prevent such incidents through HISPI Project Cerebellum TAIM (Govern).

Matched TAIM controls

Suggested mapping from embedding similarity (not a formal assessment).

Alleged deployer
valery-korovin, storm-1516, internet-research-agency-veterans, center-for-geopolitical-expertise
Alleged developer
valery-korovin, storm-1516, internet-research-agency-veterans, center-for-geopolitical-expertise
Alleged harmed parties
ukrainian-general-public, joe-biden, general-public, democratic-institutions, biden-presidential-campaign, american-conservatives

Source

Incident data is from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/727

When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.

Pre-print on arXiv · Database snapshots & citation guide

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the "Cite this incident" link on that incident's page.