OpenAI identified and disrupted five online influence campaigns, run by state actors and private companies in Russia, China, Iran and Israel, that used OpenAI's AI technologies to generate social media posts, translate articles, write headlines and debug computer programs in support of political manipulation and geopolitical influence operations.
OpenAI reported in May 2024 that it had identified and disrupted five covert online influence campaigns that used its generative AI technologies to manipulate public opinion and influence geopolitics. The campaigns were operated by state actors and private companies from Russia, China, Iran and Israel. The AI tools were used to generate social media posts, translate and edit articles, write headlines and debug computer programs, typically to support political campaigns or to sway public opinion in geopolitical conflicts. The Russian Doppelganger campaign used OpenAI technology to generate anti-Ukraine comments in multiple languages and to translate pro-Russia articles. A second Russian campaign, operating via Telegram, targeted Ukraine, Moldova, the Baltic states and the US, using AI to generate comments and debug code for automated posting. The Chinese Spamouflage campaign used the tools to debug code, analyze social media activity and generate posts attacking critics of the Chinese government. The Iranian campaign, linked to the International Union of Virtual Media, used AI to produce pro-Iranian, anti-Israeli and anti-US content. The Israeli campaign, called Zero Zeno, used the technology to create fictional personas and spread anti-Islamic messages across social media platforms in Israel, Canada and the US. Despite the AI assistance, the campaigns reportedly failed to gain significant traction or expand their reach, with posts receiving few replies and likes.
Domain classification, causal taxonomy, severity scores, and national security assessments were generated by LLM classification and may contain errors.
Using AI systems to conduct large-scale disinformation campaigns, malicious surveillance, or targeted and sophisticated automated censorship and propaganda, with the aim of manipulating political processes, public opinion, and behavior.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.