Seven French families sued TikTok, alleging that its algorithm exposed their teenage children to harmful content promoting suicide, self-harm, and eating disorders; two of the children died by suicide at age 15.
Seven French families filed a lawsuit against TikTok in the Créteil judicial court, represented by lawyer Laure Boutron-Marmion as part of the Algos Victima collective. The suit alleges that TikTok's algorithm exposed seven teenage girls to videos promoting suicide, self-harm, and eating disorders. Two of the teenagers died by suicide at age 15, including a girl named Marie who died in 2021; four others attempted suicide, and at least one developed an eating disorder. The families claim TikTok's algorithm trapped the teenagers in 'bubbles' of toxic content. The case is the first grouped action of its kind in Europe.

TikTok has previously stated that it takes children's mental health seriously and prohibits content showing, promoting, or sharing plans for suicide or self-harm, and that it uses technology and moderation to enforce these standards. The company could not immediately be reached for comment on these specific allegations. TikTok faces similar lawsuits in the United States and scrutiny from regulators, including the European Union.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI that exposes users to harmful, abusive, unsafe, or inappropriate content, which may involve providing advice or encouraging harmful action. Examples of toxic content include hate speech, violence, extremism, illegal acts, and child sexual abuse material, as well as content that violates community norms, such as profanity, inflammatory political speech, or pornography.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)