A viral Facebook video used deepfake technology to fabricate a news report in which a fictitious Kenyan doctor promoted a dubious health product, falsely claiming that pharmaceutical companies had destroyed his house for criticizing them.
A deepfake video circulated on Facebook combined manipulated footage of Citizen TV news anchor Swaleh Mdoe with a fabricated doctor named Kiprono Chepkurui. It claimed that pharmaceutical companies had destroyed the doctor's house in an explosion in retaliation for his criticism of their products, then showed the supposed doctor promoting a miracle cure said to instantly normalize blood pressure and resolve nine out of ten chronic conditions. The anchor's footage was genuine, but his words and voice were altered; the doctor's footage was lifted from a 2017 YouTube video of a US medical student, with the speech and lip movements digitally manipulated. The explosion footage came from an unrelated house explosion in Ohio, USA, on November 20, 2024. The video accumulated over 497,000 views and was designed to exploit health concerns to sell an unproven product. Investigation found no record of a doctor named Kiprono Chepkurui, and the video displayed multiple red flags typical of health scams, including promises of unrealistic results without scientific evidence.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed