Scammers used AI voice cloning technology to impersonate a WCPO 9 meteorologist on Facebook, sending friend requests from a fake account and asking for money, then using the cloned voice to convince doubtful victims of the account's authenticity.
Scammers created a fake Facebook account impersonating Jennifer Ketchmark, a WCPO 9 meteorologist, and used AI voice-cloning software to make the impersonation more convincing. The fake account sent friend requests and then asked recipients for money in direct messages. When potential victims expressed doubt, the scammers replied with AI-generated voice messages that sounded like the real Jennifer Ketchmark, saying things like, 'I see no reason why you are doubting me, if I'm not Jennifer Ketchmark, who else would it be?' The report notes that scammers can now use free AI programs to clone a person's voice from as little as three seconds of audio. The fake profile showed telltale signs: an extra 'r' in the username ('Jenniferr'), few friends and likes, and a recently created account. WCPO 9 warned viewers not to engage with the fake account and to report it to Meta. The incident is described as similar to recent 'grandparent scams,' in which AI voice cloning is used to impersonate family members in phone scams.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others such as through cheating, fraud, scams, blackmail or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism for research or education, impersonating a trusted or fake individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed