YouTube's recommendation algorithm and content moderation systems failed to detect videos with bestiality thumbnails that accumulated millions of views; the content was removed only through manual intervention after media reporting.
YouTube's AI-powered content moderation and recommendation systems failed to detect and remove videos featuring graphic bestiality thumbnails that were easily discoverable through search queries such as 'girl and her horse'. The incident involved dozens of videos with explicit thumbnails depicting women in sexual acts with animals, some of which accumulated over 2.3 million views and remained on the platform for months. These videos, primarily originating from Southeast Asian countries such as Cambodia, used misleading thumbnails to drive viewership while the actual video content showed women caring for animals. YouTube's thumbnail monitoring technology failed to flag the violations because the images lacked the visual characteristics typical of pornography, and the recommendation algorithm actively promoted the videos to users who clicked on initial search results. A YouTube employee identified the content as similar to that of a Cambodian content farm removed in 2017. The videos were reported to YouTube via Twitter on April 19, 2018, but remained searchable days later; YouTube began removing the content and terminating accounts only after being contacted by BuzzFeed News. The incident highlights the platform's ongoing struggles with AI-powered content moderation across harmful content categories, including terrorism, child exploitation, and misinformation.
Domain classification, causal taxonomy, severity scores, and national security assessments were generated by an LLM classifier and may contain errors.
AI that exposes users to harmful, abusive, unsafe, or inappropriate content. It may involve the AI providing harmful advice or encouraging harmful action. Examples of toxic content include hate speech, violence, extremism, illegal acts, and child sexual abuse material, as well as content that violates community norms, such as profanity, inflammatory political speech, or pornography.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed