Multiple celebrities, including actor Kim Seon-ho, have been targeted by deepfake scams where AI-generated content impersonates them to demand money from fans and spread false information.
The entertainment industry is facing a growing problem of deepfake misuse targeting celebrities. Actor Kim Seon-ho's agency, Fantagio, issued a public warning about deepfake videos and impersonation attempts demanding money in the actor's name. The agency clarified that neither the actor nor his staff would ever request money or personal information through private contact, describing such activities as illegal.

This is part of a broader pattern affecting multiple celebrities. YouTuber Dex's agency, Kick the Hurdle Studio, previously warned that his likeness was being used in deepfake gambling ads. Saram Entertainment reported illegal deepfake content targeting actress Park Gyu-young. Blitzway Entertainment revealed scams impersonating both the company and its actors, including Ju Ji-hoon and Chun Woo-hee, for financial gain, with cases discovered both domestically and abroad. Despite repeated warnings and countermeasures from agencies, impersonation and exploitation through deepfakes remain a persistent problem causing distress for celebrities.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed