A Florida woman lost $15,000 to scammers who used AI-generated voice cloning to impersonate her daughter in distress, claiming she needed bail money after a car accident.
In Dover, Florida, Sharon Brightwell received a phone call that appeared to come from her daughter's number. A sobbing young woman claimed she had caused a car crash, hitting a pregnant woman while texting and driving. The voice was convincing enough that Sharon believed her daughter was on the line. A man then identified himself as an attorney representing her daughter, saying she had been detained and needed $15,000 in cash for bail. Following his instructions not to tell the bank what the money was for, Sharon withdrew the funds and placed them in a box, which a driver collected. The scammers called again, claiming the unborn child had died and demanding another $30,000. Sharon's grandson intervened by calling a family friend, who reached Sharon directly with her real daughter on the line, revealing the scam. The family believes the suspects used videos from Facebook or other social media to create an AI-generated replica of the daughter's voice. Sharon and her husband are recently retired, and the stolen money represented their savings. A report was filed with the Hillsborough County Sheriff's Office, and the family launched a GoFundMe campaign to recover their losses.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial gain, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed