Russian state-backed operators used AI tools, most likely OpenAI's ChatGPT, to create and spread disinformation targeting the 2024 US elections through a network called CopyCop, which plagiarized legitimate news articles and rewrote them with added partisan bias to influence American voters.
In early March 2024, a Russian disinformation network called CopyCop began operating websites that used large language models, most likely OpenAI's ChatGPT, to plagiarize and modify content from legitimate news outlets. The network, run by John Mark Dougan, an American citizen who fled to Russia in 2016, had ties to Russian military intelligence.

CopyCop created over 160 fake websites that used generative AI to rewrite articles from Russian media, conservative American media, and mainstream British and French outlets. The AI was instructed to inject partisan bias through prompts such as 'Please rewrite this article taking a conservative stance against the liberal policies of the Macron administration in favour of working-class French citizens.' By the end of March 2024, CopyCop had published more than 19,000 articles across 11 websites, many produced and posted automatically. The network targeted divisive issues including slavery reparations, immigration, and the Ukraine conflict; one fabricated story, which claimed Ukrainian President Zelensky had purchased King Charles's house, was viewed 250,000 times within 24 hours.

The operation was part of broader Russian influence campaigns that failed to achieve measurable impact on the 2024 US elections: researchers and the US government exposed and debunked the content before it gained significant traction.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to conduct large-scale disinformation campaigns, malicious surveillance, or targeted and sophisticated automated censorship and propaganda, with the aim of manipulating political processes, public opinion, and behavior.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed