Two young girls died after participating in TikTok's 'blackout challenge,' which the platform's algorithm allegedly promoted to them; a lawsuit filed by their families claims TikTok's recommendation system intentionally pushed dangerous content to children.
In 2021, two young girls died after participating in TikTok's 'blackout challenge,' which encouraged users to choke themselves until they passed out. Eight-year-old Lalani Erika Renee Walton of Temple, Texas, died on July 15, 2021, and nine-year-old Arriani Jaileen Arroyo of Milwaukee, Wisconsin, died on February 26, 2021. Both girls had received phones at young ages and became 'addicted' to TikTok, posting videos in hopes of becoming 'TikTok famous.' The lawsuit filed by their families in Los Angeles County Superior Court claims TikTok's algorithm 'intentionally and repeatedly' pushed videos of the deadly challenge into the children's feeds. Lalani had received her phone for her eighth birthday in April 2021 and had been watching blackout challenge videos repeatedly before her death. Police determined her death was 'a direct result of attempting TikTok's Blackout Challenge.' The families, represented by the Social Media Victims Law Center, seek unspecified damages, alleging that TikTok knew the challenge was spreading and that its algorithm was feeding it to children.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Risk domain: AI that exposes users to harmful, abusive, unsafe, or inappropriate content. May involve providing advice or encouraging action. Examples of toxic content include hate speech, violence, extremism, illegal acts, or child sexual abuse material, as well as content that violates community norms such as profanity, inflammatory political speech, or pornography.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)