A scammer used AI-generated deepfake videos of Miami Beach realtor Andres Asion, overlaid with a dubbed voice, to carry out a romance scam against a woman in the UK for approximately one year.
A woman in the United Kingdom became the victim of a romance scam involving AI-generated deepfake videos. The scammer created videos showing the face of Miami Beach realtor Andres Asion with a dubbed voice that addressed the victim by name, making the deception highly convincing. The fraud continued for approximately one year before being discovered. Believing she had been in a romantic relationship with Asion, the victim traveled to Miami to meet him and contacted him directly through his business WhatsApp line. During their in-person meeting, both parties realized the videos had been artificially generated using AI. Asion said that friends and family who viewed the fake videos told him they would have believed the videos were authentic. When Local 10 News attempted to contact the suspected scammer, the phone number had been disconnected. The incident highlights concerns about AI being used for fraud and the need for protective measures against such misuse.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial gain, and creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed