AI-generated deepfake audio clips falsely attributed to imprisoned Pakistani politician Imran Khan, in which he appeared to call for an election boycott, circulated on social media on February 7, 2024, one day before Pakistan's general elections on February 8.
On February 7, 2024, one day before Pakistan's general elections on February 8, a fabricated audio recording purporting to be imprisoned former Prime Minister Imran Khan calling for an election boycott by his party, PTI, circulated on social media platforms including X (formerly Twitter). Audio forensics experts assessed the clip as AI-generated, identifying unnatural white noise, monotone intonation, mechanical-sounding consonants, and evidence of digital manipulation at specific timestamps. Deep-learning models from IIT Jodhpur assigned confidence scores above 0.9 that the audio was artificially created. PTI immediately denied the clip's authenticity and clarified that the party would not boycott the elections. The incident was part of a broader pattern of deepfake videos and audio clips targeting PTI candidates during Pakistan's election period, including multiple fake recordings of party leaders falsely announcing election boycotts. The proliferation of AI-generated disinformation occurred against a backdrop of internet shutdowns that hampered fact-checking efforts, affecting Pakistan's roughly 128 million registered voters. Similar deepfake incidents had occurred during Bangladesh's recent elections, highlighting the global challenge of AI-generated electoral disinformation.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to conduct large-scale disinformation campaigns, malicious surveillance, or targeted and sophisticated automated censorship and propaganda, with the aim of manipulating political processes, public opinion, and behavior.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed