Fraudsters used AI deepfake technology to create fake investment videos impersonating presidential election candidate Heather Humphreys, circulating the fabricated content on Meta platforms to lure people into fraudulent investment schemes.
Fraudsters deployed deepfake technology to clone both the image and voice of presidential election candidate Heather Humphreys in fabricated investment videos. The AI-generated content falsely portrayed Humphreys endorsing high-return investment schemes and was distributed primarily through Meta platforms. Bank of Ireland issued warnings about the convincing fake videos, which were designed to exploit public trust in well-known figures and lure unsuspecting individuals into fraudulent schemes. The bank's Head of Fraud, Nicola Sadlier, expressed deep concern about the ongoing spate of scams and warned that more such videos may appear in the coming weeks. The incident highlights vulnerabilities in social media platforms' verification processes for financial advertisements; it has prompted calls for platforms to confirm that advertisers are authorized by recognized regulatory bodies before financial services advertisements go live.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.