Russian operatives used AI-generated synthetic voice technology to create a fake video featuring a fictional Ukrainian troll farm worker who claimed to target American elections, as part of a broader disinformation campaign to influence U.S. voters.
The Russian disinformation group Storm-1516, which includes veterans of the Internet Research Agency, created a fake video featuring a woman named 'Olesya' who claimed to work for a Ukrainian troll farm targeting American elections. U.S. intelligence analysis found that Olesya's voice was synthetically generated using AI. The video falsely alleged that Ukrainian operatives, after a visit from CIA agents, were instructed to prevent Donald Trump from winning the 2024 election. Microsoft identified Storm-1516 as the likely source, describing it as a collection of disinformation experts focused on creating viral videos for American audiences. Since August, Microsoft has identified at least 30 videos produced by the group; early videos targeted Ukraine, while later ones aimed to influence American politics by appealing to right-wing audiences with anti-Biden messages. The fake video was part of Russia's broader multimedia influence apparatus, designed to erode trust in democratic institutions and exacerbate social divisions. While the reach of this particular video has not been quantified, it was part of an ongoing campaign to spread disinformation ahead of the 2024 U.S. elections.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to conduct large-scale disinformation campaigns, malicious surveillance, or targeted and sophisticated automated censorship and propaganda, with the aim of manipulating political processes, public opinion, and behavior.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed