A Los Angeles woman lost her life savings, totaling over $431,000, after scammers used AI-generated deepfake videos impersonating General Hospital actor Steve Burton to convince her she was in a romantic relationship with him.
Abigail Ruvalcaba, a Los Angeles woman diagnosed with bipolar disorder, received multiple messages and deepfake videos on Facebook Messenger and WhatsApp purporting to be from General Hospital actor Steve Burton. The scammers used AI technology to create convincing video content that replicated Burton's voice; when shown an example, the actor himself confirmed it sounded '100%' like him. Believing she was in love and planning a future with Burton, Ruvalcaba initially sent over $81,000, then sold her family's condo for $350,000 and gave all the proceeds to the scammer. The scammer claimed he needed money after his home was destroyed in a fire and sent messages about their future together, including references to a 'beach house'. As a result, Ruvalcaba is now deeply in debt and will have to file for bankruptcy, potentially losing her family home. Her family is pursuing legal action to reverse the condo sale; the property has since been flipped to another owner. Burton had previously posted warnings on social media about similar scams targeting his followers.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed