Scammers used AI-generated deepfake technology to impersonate rapper Fat Joe in video calls, targeting aspiring artist Cherelle Kozak in an attempt to defraud her by requesting money for fake music promotion services.
Scammers used artificial intelligence to create realistic deepfake video and audio impersonating chart-topping rapper Fat Joe. The deepfake was built from just a three-second clip of the celebrity, which was processed through software and mapped onto the scammer's face during video calls. Aspiring artist Cherelle Kozak was contacted via text message by someone claiming to be Fat Joe, who expressed interest in her music. During a video call, the scammer appeared to be Fat Joe in a recording studio, convincing Kozak of the call's authenticity. The fake Fat Joe offered to submit her songs to a radio station and potentially sign her to his platform. Kozak was directed to upload her songs through a portal link, after which she was immediately asked for money to play her songs on the radio. At that point she became suspicious and, through a Google search and a warning Fat Joe had posted on his own social media, discovered that this was a known scam. According to the Better Business Bureau, Texans lost more than $2 million to impersonation scams in 2024, and AI is making such scams increasingly difficult to detect.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial gain, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed