AI tools were used to create and spread a fabricated story about a missing Boy Scout named Eric Langford. The story went viral on TikTok and other platforms, misleading millions of viewers into believing the false narrative.
In December 2025, a YouTube channel called UNKNOWN Files published a 28-minute fictional story about a 14-year-old Boy Scout named Eric Langford who allegedly disappeared in New York's Adirondack Mountains in 1989 and reappeared in 2001 after being held captive. The story was generated using artificial intelligence tools for script writing, image creation, voice narration, and video production. A TikTok user then edited this content into six shorter clips that collectively received millions of views. The fabricated story included AI-generated images of the supposed Boy Scout and of purported surveillance footage showing a man entering a police station. The false narrative spread further through AI-generated WordPress blog articles shared on Facebook pages managed from Vietnam. Fact-checkers from Snopes and Lead Stories investigated and confirmed the story was entirely fictional, noting that no newspaper records existed of any such disappearance. Detection tools suggested AI involvement in both the text and images, though such tools can be unreliable.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed