A Port Neches man received a scam call that used AI-generated voice cloning to mimic his sister's voice and convince him she was in distress and needed help.
Jace Edgar of Port Neches, Texas, received a phone call from an unknown number with a 409 area code. The caller used artificial intelligence to clone his sister's voice, claiming she had been in an accident and needed help. Edgar initially believed the call was genuine, saying it sounded exactly like his sister, and that the caller knew his name and called him 'brother.' He became suspicious after about two minutes, however, when the person stopped responding directly to his questions and acted as though they couldn't hear him. He hung up and called his sister directly, confirming she was safe and unaware of the situation. When the scammer called back, Edgar warned them that police were on their way. Corey Kneeland of the Jefferson County District Attorney's Office confirmed the call was part of a growing trend of AI-powered scams. Officials noted that unless an actual crime is committed, law enforcement has limited ability to act, and that the best defense is to avoid answering calls from unknown numbers.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial gain, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed