Russian, Iranian, and Chinese state-linked groups created and spread AI-generated disinformation targeting the 2024 U.S. election, including a deepfake video falsely accusing Vice President Kamala Harris of a hit-and-run and another falsely accusing Governor Tim Walz of sexual assault.
Multiple state-linked disinformation campaigns used AI-generated content to target the 2024 U.S. election. The Russian group Storm-1516 created deepfake videos, including a fabricated hit-and-run story about Vice President Kamala Harris that received millions of views and was cited more than 10,000 times on X within 24 hours. The fake story claimed Harris had struck a 13-year-old girl in San Francisco in 2011 and featured an AI-generated "victim" whose voice, according to TrueMedia analysis, had a 97% probability of being AI-created. The group also produced a video falsely accusing Governor Tim Walz of sexual assault that gained 5 million views on X in under 24 hours.

The Iranian group Cotton Sandstorm conducted reconnaissance on election-related websites in swing states, while Storm-2035 posted divisive articles while posing as local U.S. news outlets. Chinese Spamouflage operations targeted Republican candidates, including Senator Marco Rubio and Representative Barry Moore, with antisemitic messages and corruption accusations. Microsoft's Threat Analysis Center warned that these operations would intensify in the final 48 hours before Election Day, with all three nations using AI to cast doubt on election integrity.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to conduct large-scale disinformation campaigns, malicious surveillance, or targeted, sophisticated, and automated censorship and propaganda, with the aim of manipulating political processes, public opinion, and behavior.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed