AI-generated deepfake audio of Croatian footballer Luka Modric was used in a fraudulent Facebook video promoting a fake investment platform, targeting Croatian citizens with promises of guaranteed returns.
A fake Facebook page called 'N1 HR' published a video on December 4th featuring AI-generated audio of Croatian national team captain Luka Modric promoting an investment platform that promised financial stability and earnings of up to 4,000 euros per month. In the deepfake audio, Modric appeared to discuss fears of war and to offer access to a trading platform requiring a minimum investment of 250 euros. The audio contained unnatural pauses and non-existent Croatian words such as 'semtram', indicating AI generation. The video received 8 shares, 68 comments, and 154 reactions. A button below the video directed users to a fake website mimicking the Index.hr portal and promoting the 'Immediate Matrix' platform. The scheme is a form of 'pig butchering' fraud, combining a fake investment offer with a cryptocurrency scam. The incident follows similar AI-generated audio scams featuring other Croatian public figures, including Health Minister Vili Beros, scientist Ivan Dikic, and N1 journalist Nina Kljenak, as well as Elon Musk. The real N1 television confirmed that the page and its content were fake and warned users not to fall for the scam.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted individual or fabricating a persona for illegitimate financial benefit, or creating humiliating or sexual imagery.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed