Russian Telegram channels circulated a deepfake video of Ukrainian commander Andrii Biletskyi in which he falsely claims that Ukraine avoids identifying fallen soldiers' bodies to prevent compensation payments; the fake was created by inserting AI-generated audio into original footage.
Russian Telegram channels circulated a deepfake video allegedly featuring Ukrainian Third Army Corps commander Andrii Biletskyi claiming that Ukrainian authorities deliberately avoid identifying fallen soldiers' bodies so they do not have to pay compensation to families. The video was created by using artificial intelligence to generate audio mimicking Biletskyi's voice and inserting it into original footage from May 16, 2025. Evidence of manipulation includes unnatural facial expressions and results from the Hive Moderation AI-content detection service, which indicated that 99% of the audio was artificially generated. The original video showed Biletskyi discussing entirely different topics, with no mention of fallen soldiers or compensation. The deepfake also falsely identified Biletskyi as commander of the Azov Regiment rather than by his actual position as Third Army Corps commander. This is another example of Russia using the sensitive issue of returning fallen Ukrainian soldiers' bodies as a tool for manipulation and propaganda. The incident was detected and debunked by Ukrainian fact-checkers, who identified the technical indicators of AI manipulation and traced the source material.
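The "99% of the audio was artificially generated" figure above is the kind of aggregate that detection services derive from per-segment scores. The sketch below is a minimal, hypothetical illustration of that aggregation step only: the `Segment` shape, the `ai_probability` field, and the 0.5 per-segment cutoff are assumptions for illustration, not Hive Moderation's actual API or scoring method.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start: float           # segment start time, seconds (hypothetical response field)
    end: float             # segment end time, seconds
    ai_probability: float  # detector's probability the segment is AI-generated

def ai_generated_share(segments, cutoff=0.5):
    """Return the fraction of audio time whose segments score above `cutoff`.

    This mirrors (very loosely) how a clip-level percentage like "99% AI-generated"
    could be aggregated from per-segment detector scores.
    """
    total = sum(s.end - s.start for s in segments)
    flagged = sum(s.end - s.start for s in segments if s.ai_probability > cutoff)
    return flagged / total if total else 0.0

# Hypothetical scores for a 10-second clip: the intro kept from the original
# footage scores low, the overdubbed portion scores high.
segments = [
    Segment(0.0, 1.0, 0.12),
    Segment(1.0, 10.0, 0.97),
]
print(f"{ai_generated_share(segments):.0%} of the audio flagged as AI-generated")  # 90%
```

A real fact-checking workflow would obtain such scores from a detection service and combine them with the visual cues (e.g. the unnatural facial expressions noted above) rather than rely on a single number.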
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to conduct large-scale disinformation campaigns, malicious surveillance, or targeted and sophisticated automated censorship and propaganda, with the aim of manipulating political processes, public opinion, and behavior.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed